[Binary tar archive — contents not recoverable as text]

Archive listing (from tar headers):
- var/home/core/zuul-output/           (directory, owner core:core)
- var/home/core/zuul-output/logs/      (directory, owner core:core)
- var/home/core/zuul-output/logs/kubelet.log.gz  (gzip-compressed kubelet log, owner core:core)

The remainder of the file is the compressed binary payload of kubelet.log.gz and carries no readable content; extract the archive and decompress the log to view it.
"+0e&e:Á xQ@.5.7x.n>9DkU)y%;u`BA@*`Ɓ`ؓdrz=z)+/*_<`W8U kL qrjΣ w)g=-)0©|zCX <&ɅBBE(zO3(ď| P|ƒ;1sr-#l_uTt#RB*0r}Y$|ǣ.`&p*\N)\sguѽAUXiX:_RM)@`b ,Ba `%D`a'KPhUBY<@h!m_T=B~Sw"1>D 8{詰{ZTYe<[*Q&M d΃wtT&%sM8!v.xn m)[=2(߂*ޮpD(*|TtW\\7,ZAJE'IgDAI<=!w?/ӫv bJ,kj:7;a<1>}iN/%]c_ rʮ G_ 2V?2̲oycGk=:T0'R, T\'//*blY!$."ie v=ΙПJBT6JY[F*e";ATY+sd6V1!C`/XB/ݳnU/׷x-v_OT'LI)Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Ic4Icۭ@IKIt_yu0[ϾNfiI:i]@:1i=}J#;D  u;P* >r/D[pKwm6qq]702"}+sO>a~+Ts_E^Z#!ŒbOmsֱEA¥|^/We"e"6gt+)"~^Cq*.NkYԗ?\ ]"SAheh> !(Jiha*ŕ|L$6VJJB&ga%^w^2'haJםzܺi݃Mt/Zu9qeqî U^L]w7b8f魝ʽL,Noz uRXl,5EY,NbQRHd@ d@ d@ d@ d@ d@ d@ d@ d@ d@ d@ d@ d@ d@ d@ d@ d@ d@ d@ d@ d@ d@ d@baR|I,0^Eny1,3Rv,0ˢ@bax"bqei ^rT IY|r+rT GrT)GLC;OW!T/W,%bUT"H]X)HFXޞm:=9*5Sɰ.|)qñ ~2 =nx2_ZG}] I Y1S|f xCOzLX%mީ's\ّMqӨL>́?Mws@ywf#?JFv#O^/w,P/OyS|)& Gxr#'jGN^ri9 fV>\w<4xjP8Oy3d8}zy|!'6Jp4|MN,1+H-(k&^p\MfxN<~ h>.LV-n8^N6$ܤfWOcwS chGR^tllE klqʠ,{}mɪʼ."ޒqȲ`LQP+’DEJh4TT e13vLIfJ蠂PFSP\b+/+*3P\ Ë 6DFm:}j b.+0zQ{Hfs`_:V>bo  &q%&q$ę'62oIܑ2Ľ\\#, 6&q$a8lMI6&q$a8lMI6&q$a8lMI6&q$a8lMI6&q$a8lMIh ^PkqJ5mՃ뛄}a̻u/Ϩ\-%I/LWiU{,ƞvodگ|<|2+7GdJN&$7ᳺP/+-H,DEv1b INԱ2GjgXG7" ;X"ղ>%t@Q -M=)uPAkȝ "D*`JrcJͫh. z]Ckz}6 Qٷ8,T z5`mXw˓~7ǟǓo24|yOKͦ} tJ~ }ULLknU/E }˻/{{xQ?།3A0/LF)MǙPH=IBN ddrq)!)TK'U:H+/PF ^ew!{=Eų)UHm:B ~J?{x$4O#6F.B'@1O1Few߂i׉[&c2WS?q\hU UD[2u '3* FgEPgB%VʔUZC#Th]O<~dOmT?,vxw>eqWv>]F3{(CGGFe@!)b,VpMWx,R!%Ϫ9%2jp8l2YT/D1EQ+0t[:ǽ3{L\@{]>=x@/ x4?G"P"d1,y,Z9RȈ4 ,$U~-e)t2/q/Ã\ oK^AqA.Fٿ])BA*SHݗ\^ElXp;"~<19LBԘŇD6`6GX 6Ic#cB0̚Z@;cjIVx3/YvrMT]FOwD2#Λ6^~1nS2y.\x{ځRd݃~1/GrZIw&1jF**-/xU"M\SŔ <•ġעVڬ1GW1t[($0d<̶P-<-\(=K:L=:㎄ls$V^ $6bgr)y%`)4Q,Y4OcE]A$FKS1a0J^9bÖ&LC*p`@!>)D P(!|`]djFWQ[C_":B5 5 5혚vvXkɂ^QEN8UӲpVÜ&l?C;L0v1 tͥ0 v i"Cdh/R/'cMI i@U-0?̮>k?5BYM%hI'JjjDpDHu%1H XRm .~!{?Xz{|zyg]%2KUc?5ε*DZ# %015Qzj}{vBV;/6{hk!>l=jiػ^^i3vxr똴W>u A 0d׻S[Q kQ)o-QBտGvk^w $uR`-\%9jfU$_J0_? 
beJa-8bK[aY~gɷ %cFzYW++TyF μCM7?е[`T*H/jO/?iu,÷1\F=-_zSֱ(50*?:-j\^0b﮻]VGnQsOk{p./4:l[|8eeڭcz 5/ն|ִjksWiYnSt|zmͯ)פrJ\{`t>{oa,ƵTM YjIjUBd-+h2͒[ó+ul H}LZhsLxzV>CdR rZH TI-kmH /7Rx"b5~Ja2n!cTzF*|Fg@D\USWWJZuGN&rnwF4L?UEvNGA?ܡԊ9gAsWLLzb=*gV|6/Nk: WOL S%m3ҽZ.j=C:8jy f먔u(W:[[VB{`u\yD0yɵv%`T~NDgbsqdmD-I<#ulU"WsQWZuJ[uՕstst>\AE]%j[W@|@ގRC,f9^ zj-sv;K{/8"x};m *vC{CFc[Jg?:m硟Q%;\u9Ps5Iܲc"_n]VW1F: lp6HF');^sXeЪu^\T]?z+a.ҥ\ PO{#rbJPD(C1A "8'T јF5P2zE鎦PG S\-x ,3AI5#gf\oE6Bٰ..<.R6~H$sܸw=ܸ7^Afzc|E# kf7JJ oJ]XC.W~}m /D"BBe,YJL&0"~ 6on*^V(Q+è;eXS4ӑ ap$m:`!xf``P=!"}%X{))qJ7DdSaظo6Fft0E1O6ι_W;L DjG~fMs]UD5{ipD_.yhBEu1H؉`8 QLyR,SA0kgN[t+olMfWq~=dŮy/X Vطm8H%$出aoDDZn}j'tNl^ºaR/1dd!R&R/5eDDL ` Xy$@C03uc|fK}3f'Vٝu?J{#Nke-eL@yM>jb w% >%7V"FMsFV;||Z9;8Z օiЅ᨞ an܅}p͊C}6XL3Ʊ̬P؊ Ù슢S:#}QT㮗,tף>l#[.n !hA(*##xVunH=Q0!S&aJ;f2Rʃp-a5H}c|~2~}Ht1xX0U[5y9SzEl13w\' 7x-'Ns:Q`2T|#oA)I{R! a8 "%!A$x>\a/)I iq;"E RrHdLdžEwZI!ʓH&6qcP9@xAwZŠ-uEm`ʼ]j,xo'b`ղQ?#[KmMuqdi`rMq*o mRq$Xv{) R6r2J+H.{ 9j:;5~1 DJ8"B$ \ Mwy:qa@te %5J07.pb0ο?=L䱧s<*(b;KdR1 B#LG7RP0QE|6W  ΊNÞ.RN\:$U Wj'XcE`,>OW痁wLHԹ7")W V䷲\]]]/?0!FV+r1XDS?`Mw3S(; s*[w)ߊ'??o>g+J)1w[{ݛbm\X85BCb$THjR= fY> : V0bdG1w1R9*AGLjԪG$g4J a"#i`ХFb'E/ gz :TO/w\/_?]c?îXqpw s+ E{l>ЪaTC 1DMNbS-{k.@~} 4딟Cٮ ^q7]I]}OdwΤoNߖ`Lrw>W! l tMm*}>ұ*/lɜ1vJo'4)}~1:>2K~V;GJQ&?Tft go۩ Xp igH "=R(8Ku1bj)V %AEa[#)O 0I 6Ip<"b$1]œkBkB:8y{!(OecrN6cdsk}#㢼lIGEyْZŏ%eAy RmQ^vzRa%猏i|Kn ($vIQ(! gc ,xU%8. 
wү&Kuhs1{aR2h83JMt6۫-μu>]1rxn}K73\{lz.Uows8('SfNrIIe\V[:;֫K ݭ--ht|/xV._B.s f+%Ϻ <`BB7 {5tQ[mM]I6P}"mqIFϻeZNI-[M[reIw]"`E faF&>揸7-n830{f$ɶ ъ[#{|\Q&e"s{f喇yYSinND4a׮Wr=4vo.o<\6jQ{p8m(p@l iWirS+C&TfW`F2 y'SQurC TFo>Lz .7JZ㍝.f8_u:j M'N.В=OnofD4_#WYJ1|ߍ3Y_GMdKP\c`q 3d(:5b1rfٷ;'Z),X_ &a& #bmT>(O82"8i,Ԁ@J9ō8`1wa؂U]koG+D.#e :ٛ`` \l`ZRL )V.o5/ [fF3=s$B"CRj8) v֜M&v+E\x{ WTFm~*쮡ceM`h `ڒs UΥbk|i8)Ժn5mmcUkN;}UL=oCn 2\i D0h+}Q}/+C"GYRUc:GI8ҜΙ+2Ypk%FT+vS{|l-_vUPzRp4o?"n[, _ޞЦՍG~/4}i7Cw뫳iMr>GKxXMLGΣأ*#>$ntdᢦQ W9FǹvsŢcy6<ǍBE-=Mxw֜3QqK_=RTxS yEe{b2!wU|Z{\2 6rkܲ,ț$Ml{Xͱ ц-+Pwe*DTl#hUE XU͊#6.ɬr-;|w:}Cz)v,dbބmyhrR{i19"AԔ1k59_ 96STSɣө[W]]5(kH-%Z`i.cQ ,pjFWP-[!jխVZ`U jU-UճX뼩{Z`U 5Z`U N̪X뤨jU-VjZ`U jU-VX+jU-VXZ`U j53hjr(ZjU-V^7UfZ`U jU-VtVdJVdU jU-V 1RsZ`~VYCU jU-VX+|2 W-+ܲVnYZ`U jU-VxܕV^&_|x3ea7҂76)޸cɆ^L+3@BJ 9&/qШSL+ec jp b̳ 2SPk=VYXf12C(@.^&g29q1kT&1926VJrx N 1Q٘Y_۳-)3}jKPpMǶt⴮{<]fz o~|KזȕMk$5 5\0JꦽU\ڛ oۮw;/\z<]_.u:7Y;sz57r-c-R9{٥c^Dkݳ_GgU}:L[F]*@6mtB߶&GJ黯ZΝK?vf^ѼL3I-+Odo4g't-)SJ&@:"ٽ~bc3tz]6yyYOn3i؆}׽CrqkozzIjvצ)\]]#_ިWzt}؜h4nףyS/GdG)?/8ʵ3+/ʊ>n|8Ydt# xhXAG&U<6 gޞf[PMQ{΢C5 1e dyoY/6'}`5qViiŧ%^2{b:ڊ&w€_+SSsN}8\Ӥ.VzyD4o:!1T:a6v9(&,89v-C *Ou|01m^rQ`Hô*I3QAxKBwj59+.ڔ%dz0a/ 1ٜLk9W3-cgYoWFSmj j dh\]LAqV{ ?w/0kV@9DZq Sc I:f!ЦnV@POlJm!DuYCpB&Kb[9-v<xv[mWvE]1XŁ+6Dp 2 Y0:C*H|]8%UXcd-'`*emLD.)PD5jnX7Nx雔b' Jds"L(,I0q u)B9C| vsr,A1-5gC4Sen_DJx1|S] 9=Yh9׿?<}csGMđe`(9xCk,"kyF!^Z@ZGF0\AUdoneSF0FXaX"\q)^&f1ɒ>rQ KOfW狇.ٙ)K}yU<=2Y_RN%2PSLp%'h)'~<Ƌп9_8kf[Y1Zdz*f9o'2ppUϹe;}~x^Ûޥi/Q~9iz%zC \kƊJI*zP':7vMkY Rޝw^k//nFq{{ɉΦ,GQ\??=kV۾]pPp}59yq%0I٥O.=F,\3MĄFQ^,+pпuW^ҧ \eJ܎q`)V4|:r1ٯ+W;F71^<<|o|7_wW _޼ۯhRkYzM|b< ]=7{5tๆuKpXqB{kdIK_ Xr&nf Dݟtj+>/AٯPi_ǽYIzU`ϼU"ML% 4 /Mٍ$(y' j%GWAfsXt&7-p8_~6lc lBE9ETbJZ_Mk#CN`q4SLhmAC̙ 92 ́SWRk>7!H&N@5OLx1GWJx-s/uM1^]_J:ugcIZtN=ߓurnW^D^=k~⭔arjI4ʆɺbUZlȍ6L5oV\ h] '] ' 'I9ih=T.+ :*NעZ"ę܈d3D.w[Lq[%ܜr6/@| ݉LHw {1vћ>\=(]+::23qDҷRmi.c-UХoK+ҷ4Bk7j)^5]dHbleH)Cit)79sT"2DH:^ ߫JUIp]UoJ D n{Jw>Wmޏ~KHۍe6^6A hbCR[k_ MlKN&Č`t4 hp%u+A CcJ`00BL ]2F e@I>37+TD=pE^ΚkN6q4<#ЉfVdzC7`,ƹu ,wmm$~.~1vvE 
5E2ʶTEix"iaSSU]_u]]@h4E!:$zD0i(b,KTZǙӖ8]7wr>Ms(eyȆТ { c|s|$e ̗tW"WP+:D%|,}fUW\ ɨD.Uri-+:FRW`NF]%r8uPIOH]%騫D-eǮu ՕR)| J2v**Q+޺JT*VճTW-m|w41Oz?߯~Rh< #sAH5R.J"j!_Nj_G(ԧ^b-#Xs&#>k<5Qyk)  rFZBY@Br/aN(E+fBZe=&O?U<~x>t|<8#HIuGᙓ^.bHV #TA֐ 9&r6ȹMٷ:2,"1rbZ 3!A)6H=,6>xUg{SIa/NC3%߿ hʩHm7r.6X59Ji @ EҒ)E 0Ë1}df+voVrEEHrhM%vlŤ:I9| SP'ܻܘ:5^wm4| ;B{'H{ٗ >59,C0!B*Caj3Ԑ2b=6T65`: il*vw3&Q>cBǔ ??@̭T:ZKm]pq ȏz˵ #Rh 6>o~`(Zka9N`.0 r# G{+%R.w}\li0laU3Ic OiC0\7,Z~}O('M V 3Q=g O#V\A+f$ PLq "8'T јQ("$b'8>j x/x#32xMMrkll֌뭔lP](3BYt^uႢlӫ*1Ԡ][{ş `0:]c UZX4NxLErAj9 v T#E43.J0ڸ3!o2$gkCYiI5pFBHFbR[`>%@=&9C}aIꋤI;zI;i ۃP,r+QQ='qTwcθ1nJEOo[>:j#r-XFei|YK4,V>KK1N&=⚉\N&P+?⚉JR|?5!SOok^>Oo[>Oo[>;d/57^ohXi1#{dQ%\ U*[`KNr]TDR{;? xk8/JWk\WpK*XH>HԔ1тFQ4` ij9,b%KiO.⫴\xL(DD&p,QX5u[9~Mr|Jdya|d!cAyXVV(l f={I1׍} }{K[t^^n̽E: ÊlbBc $PUF,51F7;KH+&<YVO=Q0!S&aJ;f2Rʃp-qj(5^#2{곑%vZ8)牒>ÑU˛3voqN/o <}:x-OH`]LXDHRqH$['X9B8q!pEJB䛊I7_ur:o]QE{;"E RrKϛaQiƝV'A$geGAܮ00nR Hj'คZXa݃k(+#wdp=h]/7O T`U __]ٚjkFG,6K! =Le繗 e#X.c ZA"H)tJ!o_uAKQ4AUg"@sŗ<"´7ڄ&4IqӠ$ ظ䃩{yq2"ȟoyȣ}譀S1; @rN!')(ԝvp}VOA ^k@04Y!B*ϥCRrYM w$ꏉ°X_G3ax?4]%XW󟖓_<^oiW` !:3(|"jxIz)5//?n=S;7܌cͅWӋ7=%K?R}K"`oW ;W ;i:w 2Aa8jt`ZŴB7_λk׳"7T0tcĥj_< F;z &TOk6Y8s_~wo.}}珯߽oo>5:HXZ͇G#síU[淦mqkr>u[Wq߻6{k#@~vп.B^k6?fTOFWRC_A/g;7oPU5TKߨ6 .tKVzZt6[2Q=i&Mp~87F_:!2K~V;GJQ&?v:'u4 (G,3iI )b:c]ZBrIdPbX;2U#?;*`I#"FSXpZh/HD*!TWc'T$Y{b,{PCrmQ)J ]wF\o#j1v XL|D1Ib$Jz+B`]RHJo# ɕ(ZP/N>0P R-hi" )Uh! t# Sa>O_Z">v󰛫]+s5YJ(~<1lhFJD# 8qD45NT 1Α.;y-@EQAsKGCa@wѡ=Yﵳ9F\\;Yڿhq;cYn\Q偐"`dG/jɁ?T6L{ sd̒,S*f cAG˙.ز`˂-[M\0Kz`P+CY*x"6L`˂-s:KGlVΩ _'T q#)+KQPa8O}4uMBt{5VW{Zu\*7KmYHE 56fiv`aWۤ5؍``'r knTRq\} } pkWOW϶tmp;TV\29:J!+Fj-H+#u0'2輡! 
NX:t 'c*@6Ds`pjbp AHRVa S띱604쵌FMFSg#gKGZ[{0\7I+)~eC3IqX9f3o]bŖ~|u= U"vc&J3gB6av~x ֹ\KZ=rC\MmN~V\L5`tit5~zXaaŘ vj6 FV-n?3͓`eFg=gކ.ꆉ0Nqt۬_c>fS:\82r+ø1Z i:R;i$bp+MiFPZ3Ejȱt*?~4nP F>%Јٝ/g}& Zo{ur7y~=E.BV,[hWgfnA}A˥hIrˊ9J[.|`o/* ;ݞ`tֈn(yZ\nl"Hf"d]$C,XE(X(6x2퍩ݽrl;i+DN[jno78?^ߞLmdtw |-wLZn^4bVG/,d5E VtߍYb$`a{~9Nj6zmrձ@[Ÿm3 #ڨz&md`z8;N^SiiNDNP+v9q؟frd><PNuz|Ę>H@9djBNb Qyd)8E@; o|ɇ?|xXcC`9?#Y"/;R_ou :l ZbL IEVΟ?Ë(Y#Jfg4S]W]UL0xtMh! x)d\mmUt[t:@*yJ!2 qSaɾ2mMFq _iҞ{UV.?nE7>5u"Bw@Z!1__`h.֬hjܪ*U>oWF rD檠2-!mf`-PG"՞KώkTDݭ%; QlIHȪrT1yUJ3d$ iNd,8ƵY#sznK+Kni%_.&6} {S״&$]x'ߦN~*4ua86U oԸoj[0}T!1[k +һ>>)[ b՝-8*Bg@36z{z3S4A)<}jǑnxX c@݆Xq# 'qWJWVirIJtDɃ\&asrog٪WR*ŅG<& &2p!k+]6fѥd[뭉+I]A/Mq=r|vRػv)~} Kq.ER[?]Q=QLiGTlԬ(:u,5+HZkV)jVšU\xJi%;t*RBWb5•t8{DpU6h=*=*jypUAW` xઈUָC" W[e%7FxF] úW$uo4sqχɴ' *z9NOɉFY9ַLh"VJrѮwI(S0{ޛ|_^O]]smBĆ:X~+4Y% x+SJL) r;Kݨa+b; cUoz^\aI'wn +%x EUVxNW 2 &Y(J,JnjUYeY r17Ε^pn0g_İ"|m!  /eGcbqwڿ$-gp&vR~&a1!GWE\ˎHZ2+WhʹųT,t +W2T+U6fW&Nd3h()jkz3ڤRQȽA!GNN,jb<$%lҗ?kXK8/1>v9w*_p+ֲ`Ӣ]eK8Lp:9--4 Kk9 B9d,).;Y)BV'7 6t [6LrOϾ4-ND\et5eKiʺPV'wSgs'9aW:4ɢ W]}4 tO Ie>XKRm9>m6Q,,rpR!8Xf)9̉L-=QgmmTZG]s}*9_G%$- QI*&JiI (MILA9Bh>V5!0s:pM"*[I/zyD4oJGR0rY?E|\+-8.Tem=HɱWEN70i=^trZ;CȵN'& n$;=pTDv|01m^rQ`Hô*I+QAxKB޷ښRڔ%=K da974sg5q#,e/Xؚf MXh:,zaeIUq7՛]3?/`7?vr3g7T~0]/kdV@Q'.uN,(Vh9@Y\}ԣVfn(O66QCАpd2uYCp:Ḡ$ԣmhV˃A㥨ZFmסv`UV('* wX4.gEIz2h5}V0+m|,rY 9M$dQ 6j"Q)"h5qam/z`<DlM?^u!bozݒ ilcT T4 rBQsCddI 0]2-q#b>⦻mWÒKz|v&vCۛ?H's]aQ]{‰ܞ)%W}@ߌt4>ElXJiaxr *UР*4&a+/:z\ZR@N2y,HtwY:*Iq-*,u%7" V4YORW떧4תƪܸcӠfMeW"|7l,]lJxڭ4.;;2.6n͙ҿv|Pu 5׆k>v]= GvLkOL֘֝A9}_fpL *H+M\'t*udsC ed=HCR1 ,FGÑJ7h\*ܑan7@/)O0+w)My3#rL%@ E8}!e݇>B87Da,d3N mbl'ZKr{;َ0ι.=~;|m¹>C$h2 3zdZ˔BENƒ$R`H&2\6#ɑk{b1Bۙ00b{&`R۽Ux(x]+~*.z5 [6m\ZqQ3nԬwM{ˮX=C}ừa5j)G?̃ŬoGia;XÛo>6F4F6fUYDL8E=Xzfѓw?5D[{F6d[MP\̑bE8G_Mrl6L?1z쑇j҉_gg{yA_~z?+??wޟqaUoS%I|H7Wxmū'ZCwkXMnxNJm?.zI[O I,GoY]G~jU/0WHlg{CIoQD{RŏKeD/_='3Qi_U^sBQ=`et0\Xoo@ y5m}!+2SY+U4N|O?HuuȵJԑ!'08)&qm C̙ 92 ́SwFݮ# |!&N@5OLx1HU񬝳[46ZلPrܷ/qoDU?{9ӗnEi\̓BvPcJx{.E)"| L|&D_A 
'LRfI˩>IV"4w][W}@S/YQhc.A-QCMV*@(1QHdJc(#qV9֢,N1kǘ)VqϹ7n, C=t7._M/f?iZKvhc YaRZ ,[-3]3YR Т@J1c38:r;s22<_E"\.!(lƒrE$S,ڮz֚8g< hkM$igQ>TfZ;j̾զ.hH(`\Ud>Źu>^;[:  _^%ݼ27bVڐ!Z:_jx`a(n=eYGSqkҒD$ցVs]e喕["Er`PE !hǁ; * paD%\e|p|orJwkՋ9)>|;"dJ<'XȡD y'ɤ6c|} Uyژ*qKh>֭ߢ@0*W投,y[Yºok}mct۸3ŰaWH\wD4},1+\eVfzTHU',sA `)(ώ/q# b"IhPe(U1HTc411 s[PDX &rTّC_䡉lю\h'19~رWhxkxq7sfn[nwݧ.IȍMG&6hl3o~ 8_Vv{yF`niV -e^6j Y*)ڡWoXU)+`hԱwݽJl;y+&wmasmp^]Bm u| |T+wJVnZŬ8\Xx{fxWFdT`R"ylq[*ƽ1uM'o55)ڿM=i9GCxz6-epB0-(D8yĤ$B5DQA\rdPpl4SQ#SL ae;ɘd)LjG"qo &_5;6%1gOv׻wMX ypvzz7+/?ZVE^Rj̞8V^4I@[z6&Хg."#]|ËZ49wAh#^v*p;?΁5ȏER#\{|=ZyQ/Wo>/p /jq7n.wfwtn|?/}ܵt7M`z0v'S͋O=XQdt[ǫzNGp6]SBZ8Γ p#JJ2%( 'A yNw!vǝ 5I[Q29=Z=>X[cXj"tJDĂ)1Z9Xq*^r+<j׿]ǧAh6{Quf0.x>z߆>:)zw?Ûa #Rw].^0;Ŵ{O:Bi~ʰC|\/y\K\vT:>';𕙾3epp:w= r+gW[gQ$qcq 1"P"8:QD(F)pXJR O^0.'5aAZ0b#u0!eLXό'!JʘĉL]9V< B7pEvhyϲLsCq]r%,F"5Cbu?qS!>Gawv6"z`o"X5>$WAoVVs-ÚSoZB`U-m6Pˣ=m˛`>o}?3H&#+}di5>,5}0gdq9QbRJR4Wq+v6*]eiGf)+$dq%?sH)HEW\I`gd@`M٘,.bK4DUR׳hgS%lgq٘,nKǗ\} s &OFM<{_x44yBy9= hKmF7j?jF! tl!J$~K,9ꪹ„q\p㚥ո~Djuͯr_ϣѸo?ܧ{tuzyv6_SٚʏvfJN$AFTpt kA1`cpK{=B. 
NW8wN/ /x8SY N1G",o [<ړZdQ7 2i  T\`\$:fk\D[o7ˀT,ǖ _^@rnra A`nh5˳|ny~쯘}fae-YsRkNjI=kaZسii&xnRGW]kZ)VʫjZ)V{WP5<ȹC~H`Bv]ѧ(V㳕^ qju`UVu`բj-Z֢4BR bZ [ ڪkoGC4c?(q.$$26i/ VSUXS[J'2߭ͮc=v:7ŊUyD( %KpM%jF1[j7D$52B1ȸo5E4Y(^2 J<585Ex2R|vU)FWTfFP\Cp=i22PP{$_ p <2%Hܲr-O[Z" PϱcUڹܑKx@b: FT•[Vn7_2)wfwΥGߎH/,`\%NtN^$f,\Q@*/]T>bC1*@RG#]a4YN/,&v%~D]inӢ@0A;0%N=^hx$p-`S`ӳV\70UxK%+ۑOP˧㭖H߾ 0S ;!vq5Xd9qY*3̴2e"F"J<`eKAyv x sFIB @-sEA(Ek18N :刄 F%R 69`'0-&Ύtn+\0`C;Ɏ Mo\a</nrU?a mv.ԣ-%d?Bnmu̲thl3~ _p<2 RHYT2,A^Wku zf8.rK!$0eVDDN(!=VZrLhUi;*􍲮B m@Qifw\.d!Tˆgn=.h܌(eL_ zJnțMA eZx0SX )動 =.J[YvEjѮh'1a`6 p:_&mPvcb^;eB___~\#DREhkGqbtX%@ J2սP'NO̱)sl:+ya2Amr@SĦ4Up>xEXWFdT`szytqv7󌳳3K}96VZn G};n{v,^ 07̄ E/߇^1 C>.o$Wrޝ6-}6o{z^[<J@\Yci0Oɕ-#>uO ji^/{oF%ۨ҃# P")Ŗӏ_pӭj Ed ׀)5T.R"j0!+2 rՂNB7Sbz;4G+6κ׿ 'HJ'8 =r`T97ٺ\VFd"ǚq|l7D\A"~/1Z$ X<pHQ08 KN &ׅ6o2yJYVR2g($d$&9cua){9 HY"frذW)Ad^ Uom)2C*6#ELBv*eOL?1Jl!bRK0diB%<!1vJ>Ǐٿ-FjS6J JW<A[Ux!t`,CZccJ&{g2iD&D:Rb&y$LViǬQFYJy>%Nksٺ2nJ{s*_J~4s1JP0M帶W CZ69kX[_4W|lAHReoA)I{GU: VN\`}(ȗm $gh.{5Lq\surjF*#R"%ѱ9̰4N+#Ay'؋έʏAជ^ci `VkfmC.jڃk(+1c%H]/7n'b c`Yl:NTSY\SB6K'@4N{d繗 e#XU̶6< @P} ENϿz0tLBD\ثN׮";撟/yDè_W2B 1}`ָ KO3,U9rܩ|OW{q{Eݣ+4+0s 0EKA) v?G /CW{@04,(LBCwM&*m,QQR1靈k Su;>$Y}4mJ..Of/` ÐzǮS`m]>E1 nTE9L+nitQ?܌kj~|,8[$v|bs[ovQ{y5?yoz#PCKƖ_ES3,Fׯd`ŤDo\tF V Zm+XD򆒆i!Lx$5́|!ȯX$ >Eކ#ࣔ{>*&ۅ\rͯϞ=yYxg'<9{1h`F M]uFxi?\iiho4UEӂx[+Q!7{}oevTV Dէ=3t)J$IN7ZN]I\|'@/Ta19]{r! 
0z@/,ƭEoEk#*/i?4 }c?VE!5XxZR3J5`)鄡ڋl949V s8Mʰ !Z :1 p(sH۔43Wʠe-M9-\Ǹ.q> mzPBd7uҾ`]{:Q)Q61˲rOUoLY_; &REF s>dⵃNQkfMCk.l[Z \NIHG9ȢRʔ%L4@zʹ/7PjcVibᢌf+趔OUzX6ta.[ꪙzٝHt*IaTշ_b^}V<䣓)?o2doELGؔ `crãהX\&zKtP}{U鿎q*n :aX\QbT;cDT3 N*)޾yR\<ǂ"-֢яt`` 5xVarb Ͱ 1k.^ :aV| #$BJ*j  QL-7}mBmA7[LE?D&}ȶ@_`J▗M0"vS>Q3mf|`Z=̝Og+ 3x4nL2t>A9_,Ma괥̶)ts&82}p#frj2Py)^[U67qa3m>ItC/DpY&p3:@Ux6U[ղJO(+g儲r-(;g9Tm_`"Ҫa-|#+)I_څHRS&tv7{ Mi'2ѺbQT-jm3@sFiԨݚ"X}n m^ (Hg[l%Wy_ 9!6o)wv="7J7ҲՑ4%{{pA"|L^C:#^f*ƎK^0磍:,;3[x5uW^<ݬmdRYA3(,R[flw]G?B.P F6 [8~vD{q'_$Oac7S/Ibf\ūRhXQt| M,nQ$үaz O^[W"7_Ni@`ʉ##诵ïn8Lh Ga%lphuu;VIq oDΪ nT.:t;͘gͱ)0z'ecMt\ ?t./1񒍱7 3[OqoB"mKR՗mJ"!6!^EΉw.w)V)~RmĄL[mlB-i3?wiS4݈6,Д [mO鴪ؚw!qjqOA"g4`[*0Q0GS GNb( j[&UްpaC;_=,sqQq!=We.3o3x2E66 S"f#[á Rc&({cD#*mhx'!jPQ)a7:u Q1&rC":;I5/H<%NMe.ծWK(ہGK(HϜ_Jjd:u=9_PR@ۼU;i @x;TFwus٫w;ozBH1BIKVZf $&EL *c>P J@q䂠KN[>^rV"nd\ j#c5qF,e/XXmd슅2  g^sxm'4{{.h`0mH1R9$\ ,jh!$)E- gNxRDt XYQ7/`2F!EMil2*Vx=|ʣ&'̄$2bWnĎ!(ڵFǮ+PcF<5JI+sޒ˽&pf,bH5t[ũ~!9i "-8%d."Qt DX"u:`<&x*XǎY]D߯C7j8}K8J#G dFXB **"zItcJrhg")['S"MZ(+*M8\/,lFɮ(*"pŃYo]>¤X^ZԽӾ,6V|쫮x+T^^{ߞNwAߝ__}wA]w}wA]w}? R erk3;Pe{8Q&hX)YиYy(] ˊהhr@fb-[ qJF瓊hH  &=u8?,K|I}Z_$b8Ydjqn&U},>VhkYO}Aځgl%-AԐHq. EB BlR`wh%W:hpfYmx$Aa9)Uf *;tM4R`;AAP4pOp8ޖÚ|ߓ>?ehT`iKϭ M:{K%@jy{4M4=t4K^" mU'9$2+K4XYwDV,*D2vgFqI~5EB86 Gx/^03+>6WL|"r跶Hf#!Iyn!}*09 `&irYimOmVoǵM<5eؿ&z)8g\2>FJ.0Y\L ^7sa2r2JrR֑>7TKH,'`3r̥'Zx6޻/&TM N/%ژ)MUS,﷓֮[QylZ:=Ǝ,w7z\]{yu o MåU ͊ҙ3wP{~4Cy;'K/_x|jNU.y':>L>\d>Fˇ ٣ ӈO?d &yo&0x.y&WW>8yJfm+Pg"&9^uCZ":PH(1HHnK6_;[:y-](t1hw\oz?~.ݽ-X,t3'3}C }oVba{=lm^ AB ^$#}IZzM: PK#d qf$\V u*?V=sx WkP1c4taDdHB)1Q&ցX"];(nv]Mk4*:YMŶcZ{bZz4Rn=|kKۍc5 l1 KkVgگ 6G([Rq/yU|FL&YFD_:qCf=)u{q,_B[0:ȵB(eK!fjm"L)!>Dr~OeKpmlz⳧Mħ?e21I>"d0hYhk694*ƤUQ$M*w^x4yvCKp Lk,vMv͞͝T͊bӾ>c,O6X QR&h <`>yQid*K}P_Ȃštds/8ѾC)Ft(){" kRn=^YPsY/|.JDwm~@'4ϧGPꗈ>[o$=(/3 Jm9tRALB{rK&Fh[%SV@ 8M<$L1jWB ‚LbAFQ!%cAxr%JE2fbR)ޘH y O! 
ʸ?A 83rG+ȷy& " ٠UJ3bd6'G2fY#D:{ tu@OJ\ml8yQቓU`ooJy斺 Q=Q=թ '3{d <(Ux)gej-!6 nqqUJU[kёsmRNZ*MBɣq\ 2O\YVJJR$}c)&lI#'إ,bOXc̵7ޫsUYCѶxOREM=Τ&EWvuQ=QL_X3*r\6ϧqךR䘤Lz"%]㯰ȱeXY]#> 锺ypUq^fbӠƮY.24JdO51lN[Ֆ '){\0TgZ{ȱ_1e#@>{.&`1AǥXԖ~/KKJ˩ mUE.s.yI! Cv6'; "Rl w+F:#}%\hl+M<]b昈]L$]7Bƍ ;&&PΘQVBއĬ]#pR)YltGT6x103Bz NEoX6r֗ùZxT9Rn:GR u?5kG;Zg gW `+xEۏ#0$|Wi}&r3dhFd*H`nY`H& V3 :IhC}-[nrC d4@@ʆV)Y]I RY1@3ܲMX\:)nwH/ю&"QoBRIdl є䬅>bsi8+Eh61hV{3DH&5oGwѡ2--wwEGi7]|6BfJq%ĉ9#ҁ%*F\ BL[f2e )n1yFZqY6F 'D IKԊ(5@-2d%ӉGxL|ȬSA0CAmcِJW :48ecrp^y5Tp}׈򎅆+܉ 2 D,k›2CWۥkTreƲvd Z/xiX sxurn2Ak{Lھ/vU>*&ۧ%ݹe}yӫŭ'v#g]o sR'nxsU֘[ןW^Zwş-{ɾxv[ͥqLmMevT3!2$Dv{2:MHO >pBKz# ="m/#yЋ[R[dijMVĘN2dY,izA-9HhHY萘}3mʆO>cPH:~!& /-3!(+Ė2q,zli强b[Ė:G4]96if*UXR[}*tJ(g˫N~sZS}ʩ޸"@SMA0~s(μDK!21)g IA1"jf100`" 4-]*N4[mcH_L9f7c_Gv lI1v|Q6cE(7+2%RREXOJZF0+_ԻW]+˜DY)hTYp먄S 8%4oɨd]i=E*C:8dH\bT`&UgH#c^p^nXpm4d鯫'űcǏKI.F$\%IF6)B"+S)x3r)M^UECѧh]ǐockͷ|5f n4 %[hŶEZ{BZh+^Oɗd%Rm/H(76Aoڷr z)L) EdN$B`t4 e@!1@A#8'^_IQARv./ݩpYknBbQWѤ1)т!D% SrYOGrZIDfM?op%?CxЦ'ˮX5KuᇛlIe=U^v/}+}9zߌ 7.IJk䨭}T˟tU^Qyc1A_Q؃Ju=f3ZոҧrF=Ȉ̽% ~(LHyBg:o8哲wNIahBs\SFqaI .o>_~*ӿO?pb p\o> zQƒmϬG'<6h| et5&<%Z !>~sC/rt}jPsG~j"+U^:I_j0 u'qRDU>F3UJbC<*W5Kpc%GZD~ "O$-o*LӃS0numnc,Gl %nԩD*ǿu_EnrI0vdIo6Z[)|XeD!D Fn[Rkt~&PzC9zM(.L4 16k8D)A.L)䐁/hJvn_ui^VjrIJmHSC9 .(ˢMHW J{E„!)l ?Xs41| >Z'ĝ|%f wnnDR$]ȉH-d\ ?[Vz"Sȉ%RБ+n'@ҲZT֊95՝e؝b2r@!,GGU,P*!d}HjE?*FGptL`wc9<-8.`!m"̹ZxT9Rn:GR u?uG g׎v|qlrNQ1e{bI|;&YB I@}fDd-c#,b6f} yfk$4Au`o64@n\W0ERxK]]l+ W)ŕ'vC.*,T1RI, -3miL*2E@uJje1B̲J&e,EP+Bx2@ GxL|ȬSAuàlL'1^C])Q19[ > bxn8}k5xdꗗ7C~qEJe ETeovK>h3 MP E(Ź3egX99Д3RwQ=t{W!(\diq0޷?m'Zޏzbt5 xj) c8޵6ۿ"̗iMvq7Ev \d0 Y,y%y =Òǒez9<]MOUXeR8`@ *)yrUtE B]+0T`cv^*MjYLyf(N;H X9p>wHTIȜ"9="yܕ|\K^nIxewWgsbY}+`$+M{/!_(qCN9`S2.BRQ!A&x,!Y`$!tTrACERȨݬnV89N~+И'I!-nSFSeLۜV\K\*]$ꋭ#yVE.Ejdͮ?JLEewp⳾R  xrҗƓ2y* MJ+\2)$"Uy !¼UU/I WT0-KvLng.n WBsۻzԜLW r"L6@gBN`eQ,4."׈L] ρ'ñL35x%st$ QU~)|Ei{扂{A4y*1GU_gu|.FRg\<ǹ9K3o1g ~NR2I_x|XތOyG;76P\GV3jv.}\~Bhv}OFVivRE+"%wލݮv^g_mϛq ӛ?M~6?DỈ(k=B'.6-?wylX4MucNqjavuqQw-/~>9SWrzDMvm';Ao8sAwƞہO :x 
Djc;z3F\t&t{-7Eyq5 kzf4,yYJIp5|CRptbJjiUkDi`DLfc hLP}2 p=Mio*.崨Encxicx* Dh"h$@BQV2sEy1q։rnq+7]L-˹Y^킄22n=\lvpVIكH`@>o~O -;7< Vs 9`n7vev I \# mHXe?q'WU2nUJzL6#G]~KFdQ9w60.l%O{1]?_GbT @ `Zm}5P P=wD1hX%u`|QmS"e24ѱ@CMƠ%x bK5QLIU)g 0ENHPF'=118e=1ׅhYE0) #(T J#c1qG, XXL36B] uΒ1G$z|p&au|B]j *=\P@J@dڒ<ìp!i$m2XVVF \頄W4{Ąv Ĕ~yQJ{0g\ jҎMQFm[Q2ؕ%EL2rI*3VKDFPc CVV+j"R[Tڅ,@ !hE#G҈5IP,TH!I 8akO=30 "ӏ q}\ 7fZ1yZ2b8 BFg`P֕>Fp8sQ6P4!@8JMJRԂ3|VQQdhI3Ttq)qGĻˈSB6,%"+bU>W L̺s*aQh -r%H6h4D]a1ya< `x ~brWEpCZ,E* UpçfؿHOD3tSHВ"k'X-EaNF˄ё[, ^5(8nK-(\b61}"qc( Dx0H$#r@-xkA(S 9o1D~>";/P9@Z@g^gmbˬ:o 8}ڹlsާ^eT,T U={<0:`!XL8x'FIlRR(׮bL0.OvUw.ykWeiw]lUYʹ [v]řR\*+NKe)p J0+>!B'WY\~2p*K9+\FN$vOƜ \t\p{B)^!\).J9B<R\iOPZ*JDW[kOM]eqɰ,W(%%|pe#|W(0WY\&O;\e)eeW`BSτs}/t,R#vVK_5'|FD67>rafܒADx$ xKn)F gq-=Gi-aYJZUBJ5 6 "j 6 "^GURDA~Db>ήRn*J wuáDukLJv|jLJZv|jLJv|jLJv|*jLJv|X;>ԎCPqv|jLJ;>\Nx`tQ~4Q~5 j? *ڀRY$v?h"䔹p1*F$Dj3GFF)\j.VVV˙H~I$$si^KT30Il*+s`qw}xDi//cX'6={[pg|Z:҆fW]>9VvT5ّ<#1ApB7,`FHMF8#Aj2";uNU}PjU#~v'撚l+ZŴ 3Iyd+&)8vc273(JߢPVuC e`:y\.@+ F}6'S I-ekP-@PȆ0R Zs&)KɍRDe.VLP>5zxHbKBA8zޣ!cH'O΄3Ff <РPҒDZ;u0Q zl]E}%Ӵ%^b J^` 7oVptnO~ C ēF\ -4kB:P-Ulܙgydc7Rjx3.4߭yG#O~\gyJ n` S!A[.J4wSwEgxHN0tH3=> \\ϙqNtiu,@߇'w*kc{.)H˔,LGR:.2'}(O#ɉk딖MZEh s FH%%e{.a/Wr.a/t {W0.av x%Q|68xg3m-&E먕6BRxeoɸ߲;o8Gl(✙-ڵϼ˝8!Nx<w02/Y[KIwȜ." ij|p!]I~%{>N?z7q1C6G; cW ZsD\QbHC Iy1 o\<>aotLZB<*Sq">t]?̎,qF ٗN%vvC.pxgYfrlQ$:=+]̽ǤDD\mFq@IǬE7S>F燾ŻOsZ _IoG7 y echBގóYxz4h[+.s)*j: 493>=?[wGkryr0h9vKݫr0s"T_oG 7n,%f靄|NnÖ߆/ͼ bb`ż{@yu}5Zp-+#:}ew]hMqF%ϟ/};DJ-=8圬BbWNDu_~?]o[7W3@o#@ݶض[L;X`AڮI+I{xX-HbG=|}y}}?v̅;o״Sqp\%Ro>Vo%[pz5oj6fg{-サ|noО. 
)8y`ܑ_U=Ħeaz7t6YTq[*Cg!U-8#΅+W_\XH˳VGzu9[S betJ  %d`w;x_ * sR'_W֏`7r~Jr:2fP6XO+7$)N[BL9s%VXt9rN{ӑ9>7RsfhJF}s椂!ΨJʍ"D}._D$9iw'LӤCp h!QA5TeiM1*Kp3Ұ$+󗻷t;^@akWI7ߵrѧw}a 706tQQD, %2o51lgG?;ө x9{*!r3& +WR~%l]1f҅ĴJ?wũ'o3!6m9IZ p99YAR<>]du&=<I{e*,B,|2_Ȕ,Tu'u.osnf%3>%sy7S34 `*Ԙ eYV!o@Ԩd ƀ ;ݲ-;ru%J(Tq%B;*>1PёZ\qmfn閫Y_.S^7fuw%Px!*NFQ;*^'HB%˻L݀غq^oB9 XKP0LlrS3IϞ.Ns]YZu3]GwxkvEh>VLѦD4o<Fft4N34=L5"fIM0H U.!Cb1VcDjeJj-b@4: \t#BT\FH!"rA\f!GҬ(LeR[M 5Yn%v2jcJ?kY?j=,s`7oP"LHa Y(l}.yՃ᤭3.yٳ\+xαu,nUZ AS.oeC*ݶnWO?Pq~l+ lfuju ?\,5/\?O$\y9O:fWHL@ų{[ycw珵5`SsyMάP'Y)?FiO.kYstgͅU^=սr;FÏN < Rc5FI!o$X#eRMݼ7/pR>u?A̡IG8/>VBH1FI #Aܢr裸}ݙ׹3o7}:;R߱i>'rYķA_N>,.9PPj9-b2yrK&FhJTLC B˷|w:~A,V`! 60e罕 M x D b6eN˜~!@y( 11tY:nET ͮ艁3`"x}+4H+ƨ!_A jڐS0O&AJQ!%cA. Y/gVAfvg/Ƴ:G*2Bv9e7!DZ\7::fVt0^דl`[,bFևKnj wJW_53xcf!&\Z 6VhVp"l熭+:h#K ,Q.yf7Nu!jCky_{ tu@O[mn<*CMC3n"/{zW{:gq} OƓ4Y MM5\,5 #oTP)1S(rйruAW+{t%9Q'F,),+czD:.F7R+/yUsTs"GwM8FCLB`XBz|95OٿvZ[i.Ir1E+a$+oEژh-u7Ylѻ)?(H1 H[7zN^tWs|z|=Zlu~~~}{0֛Vn6?MӤw&x o~7YҢ^Ihڣ" -Vzӣػ\Gˣތѿ=BK $?**q(pU5nWEJ:zpU < *{{0pE*B;\)eǮ^#\)zpE{}8pUĵPHžÕҞ1++͌prrvQ]v{Ї88LO{ه7QV^ ?}s:?^p'gi<7{n)fLKh7R*f2^m%q+M$"?[ײ1um}"0\BFz^ Nʰ֟uѸLKz |TQO_`v3]GM x-{C;ڢޘk;Age~?_.#rI zK}H~=ߪ 6ݎevO? 
E'Lh-TpfS Y8U~{F}ѷa{rKP+ڳ|n[0 kOKm4[orڹ^G|MMzTNa88OƏQ۬А]'۬,< <1yhRLcRv]\,m{U|Y=H hYbDbBzNߛPbx`Mw-wo|Td=>ٗ,Gfh,{j7`^||h᩽PK~ӁZG}6:ud+휬ɚ!ݢ\q>sv& ۟)EgW*y'8hL'GhÀOv?PJ l BbK7h"TL}hD4{,#B)1f.3^ :lՈf1 c.:NRATd@ 1z1pff.3 qd E(Lf05Zާ8DiO]iuJg'&nViCs !jʂͨxWub MV@ uv~׼D UERb?K7QCTyVᅊ'v&4^Kj#m'~(YYhNNZ`*LFϧw&ҞVS!Zq¸rQѠ`Jˌ0$ғ$Ѥ( Ve P7q6Yqn1A\`*Q-dIUBp9` 72pUR8#c=R ͌bg\q\f"Uak;nuq\Ww@...Ξ>@S#!J pi2z8(9UHi)`UĖYꆢL(()MFEH`3v,GMN I\eĮ&È}XR1k͎ǢھC`t(5'i{Kj,HEd (YRCL̐+Ң#Y@@!HEٻ6r$W|]`^37al`h{lKIN9b%KZRlJ=$"lpiD(c$rqqs.AE`jkKS$[s63s_݀x"Kr*Zq'Z7&t-;P-eFwʭLN,{&] LB1poFFM`L&D tXAnGٿpIb)iLkS0-NPD2V`YoPp@qWdqpߺ s!]x%Xs/mH \DI໴T Fu[Yݫ$2+[5ϦE Y)Xܳ!AN"u @ޏfvgǓ8Tlt1pb(KݸZ4#v[S_NN7UQЄFGaRU]8κ6nq85{(5P?ԔMg3#Z~o>x?V3a׋~8;-c;g(©v\8y[AHKKM- fX{3bQ8yN>_Mztx]&WVlkȈL3ܰNr;Bz"~Lho09xDNNTpNϤq/\ܜt?}>~w<̜ӏ'17M|\> 9[ޏ_wgO>t5_dЕu?+(6.IdQ/4UR}Y*&kuUA 8 f0}y1_^ˑ+2fp$Ҭ 'S0v}c~v0$<8Rs *-5`8۷udz~D"P9E# aNSHKBd&9ClWL!TCooԜx8aEl{TEjk [-aDPx xJ_6TRKaΘJ"+U^ ))=%\G4ݓb|h5D<:'J$$Aʼnk@rŐ#(CThl*iWY8iY!P G6ZණEw=_?{X-4xcu*,1l͇o;tݡ#.\ecA%qkk:Gm"WY)o&Hr";r3WKв1AikYQᷫ 2vH뙐P|.ciRW~Z}/ IQoW(S pJQ4Vluf5Cf 'N>泘Lux$VCX T1IiT l4(470RLj!9S1Os8A]`PlB8m#j?mK4;L9OĉyPz֣]1 O΄3܂MY>ҠIaITҒDZ;u0A`= [Eo^Mmm o<_dU eJ^@ f36}7zЯZ Rl,ϸFF_QV2QT:y)q@m lݳ0(tnq.YJ[M@җ7/>f_ALW jK=^_i~AfA/Ø|6}/~f)<y'}M8яQwOX W9a&2{ߟiҚK;5m>IJ:7g}ຨ%SdǍ R^g. Ͽ97m@~PXT\٪]e'|_A _fg|vЋ^mw߯Oo-q>^|nxc NV̈́+w;>K79rA"̜GGgO9Q<3 DHI EWpT&o ,ZGp Xq6y>/՚1wR .&gu< Ʃ-ً]_{\wc5^1gbqYY>9i&`usphaF$BoFfi\J8+(T* pu&rKg97H}D(dUI6ʩNl75whiZr]_H/VDIQ+9~R\9p3V)@ΒaSARZt͊4&]_^#6!mc)@n\0ERxK]rMU?U3%bוTV‚ 1|y;;f1ӎ(3T2D q4)Q|0b8! 
&iRV8%3xC-@r>!V'S.ydJg0-&Ζ̴-lwuv\_0Lܭތ֣o;}>;]<1sLR\DaMA1cI:"TQ1U Awa7!, b/~h祲NĨ֚AJ3C uD $c摫İ/"!|PIu9ENvS^~GY%SXb=]llxi~E{V<1C5㣋4sw_t%8J@t HܐS.hSx(p|;Ӫz: O1LYBrIBxO i)匌* z8YsUB2OD"-nSFS 2b F +"e\*X\ꋭ{+p*ź-Zp`hPJ3 YBK%x^MU앫L MJ+\+ aa8d3M`2&0%ZY~nIl,K-[;b["dǪb=Ttq1V{N@z1y-UHpe%iaHUPLe㟄C/KQ +ˢ3IK0}(]\tz 4bIUAp,K3X٧Л>v͝~0E2.:]pPRE:ͳ3E|:к%Yź9uFŎi묙d0;|xSX14yq$MBRq5VP~ռdͮY38fģyYÜhBo3^P'6K0jVh2HREsE 6(1dt QABRP-Amv3fbh8hKj>:͆aPNuzl(3wpQKOE; %t"/h<6 VXv>.^ՠ, ʅ{Ugp6|‡%Vu'g@JL] 9eA^rÜ9Q [cIY9ڪV7hTX[lٖypDy' FN,) 'u9vm(Ha=ߊ o[HBwQhfPV|Iyl%yBY`+<S&Wwi2ϫ'?0~ ?,}[؉J Ggz(]ھ3ŧqG?f+;'}/ӭx#'ߏ܎Zcj,R +싒2(PV: }Vk Bi%Cy"teR ]6R gvEnVMlZch<BٛϪK17(K3xN.ppcvgl"6^ze fH \%r<JrpWTi >JR|(pE5B;\%*1o|!)LSt@`"CD-EWJJW/H+ S~0peP*Q+^LTVzp%\\%r;\%*i•ԚC]%`LZD-\@҈ O6W\Zkn]Hӿۿ(Jnl`EbKxX|Z5]6|ZZfxxs,V)Đ))"W ĄZ&FmF:܊_Լ{~ 9# S8H%^eϳһFSG &tsDl n"/#@x%<orfΕ:Zf6ehtjq?7ȓ͘!#f$bBPPK!;#ASB#S;dl7]M),e /;tGZKm]pn)= "z˵I6ƈZpWnԠ 쀨]QQ SbMbҩR ˽)Fm7j0)@vzfxf_24'TAS%Tc˷: xrTjeK|0yYRvM&IT~8 VXI ANR4,i',1h ,u4͖JFH~JH1 N9ť({93@idl9Y1,l36B0 wt_r4NqM aoӲpnQ? vN1[ETD,'^F; 0O5RD3;Mwh6g#`|`#Juw1rRfL/l! 
Z[D|)-Ua9(q8 KN &` m0dFQ+ *d(QH؊pzR` sI&92JÈ9#e47>:Mq4Wqs+.w+N".q<'p_KK\M: QҷO1b3F<5*KJ#HE"^ wE%KʪuRcXޘe̡jzۀtn?@4e#x/>rq-~&9n髵GKݫYg[Mח yƺ7 ۰ 7f 5'U @~c:J[IJ)R`?{c戠g|=ZLq鏣RY@OznX觨-t+\*!5Q c1F1sɭ`)$ŤUQSO]fHS͡V[)sq> qF7kB2HrJ͙d$TDcRF}pm͏5&Mٮ,m+i=%ˈ״('K?Ϳ6 l*gRbr[ܷVkz^3"*xezG9:XS4ӑ ap$m:2H6֠6K: ybX{))qJ7DŜkucِ#)Q[(w)wb87`fj,825>AH Xjb_w*V:ZM$J "6 (WKM)0qZFe)A@8˓Go:Uci/uДDIpW5V3:whAZL,*YE1&JJVxJI";'X9B8q!pEJB%b:u wqji,[;"E Rr24EafXTq 7p0<W{AccP*iw>mVܛuwQ8=R<ʼ`-5+ut9T:cM,6X,_\{4=ѴFbY\T Xm<0@4N{s/AFS4gn7R#i䡍B|^?놬ogoY\:&D!R0Ht#,"aK[UaIdo^J^&LV;or>)M]/{<2(b;d2ɐŎe SDB J1uǦ(•#(3!]HtHPCԷo`B:|D}(ϳ7>dp]^,| 7@2%ٳ_A٧'Bv 0 ɨ[ނ./Iz6a} ge1< s0Yz?Ks/Փw7b|F̕nⲚ[ovPy<*/ \lrwI#U{wC뻙gO}KLN~{3UgjpJ^n[N*}h1?tqtwq֯.L4J)ZѨGXW=\gp _?|:wo^_t:7޽SO0Rc s) lE@jCu7??kTߴk] 'F59~oEn[*7;[ D=cpufGNw䷧&hŝ$t%Od6:E&MTt*!*_ْ*wKQ Cl&مߦB UVHϕi̖ĩcPZSG0zz5 ?!2K@~V;GJQ&jCS\~'Ɨ,+?˜G`Mv7lvmi+V$$,bwKdܖZd7ĎbbWzDNQH% -sDg#0R֑|FVOÖP)`$E*ST%BPg1I&yw %4.SU-ت G[Go+?mAtmQ ع>{(Je0 <ԭ?J -p … i 7pYJQ8H >OWw] Xsfh5^6KT4I_˘ QuA{4DuhHL;ʹLT31AU^@/q# b"IhPeUPb1Jibb 悷e:"/QM4kkQ2miU%TMn5}C;3#BMVW7p7_|GE}f\t]ͻgEn)f}JgHt/bX6]^/-}Ak֨l{NmwgZ"h*nmm; p4yf~͟nvHsY*W踺w[xQoH`M&QgyJ:e-knn<9ŊST0-hR,Hg"+4A,XXRM]kARO=[&)2^8 1[bEM$g,#%8$AT4!;q/֍ݝy;wštH& b?@Y|6~6Q_Ͼ/.k#G&!༤H<^&Ƀɜ&zIHU; ' P i,')YG5*>ic L& c1P,cgN+9C\'zS^^ҫ) yެJtW)h=l^pl~EO,x 6Ĥ5pZPWC4J `do < vh2;=yqǻཔJ9A;K P9'Ν%a]֢^ )%R:Ӗ320 ZN,9~K!AI" fZ9TC$7<Ȕ ɘ{vVaM_lUhQg6CC(PԶfw8wiNdKx !7pn;{bBPC{9Bp-c9,bڃIN(s\eKrBN?^oS&I'JtPI*lM:5Zg,#TqlpK! 7!yiz*Üa'GWvp$oQL5 r_zpoЧѠo3=Q9+tWl(k&{gbjցhṽ[΃%X6q#ߓFehB`:fD[=Z O{?;to0;]s:-A٩XXIW41:-d4(Bd357e`\78#M=+]ۥ~>$^EAW:7 H`-%4,-u1W=HGl̥IhGz<٫!!ʯ؟xIMJɶOSt:3 :,Ap_ dn7^d#3I"Za5(Aբ3hGTӒMhMؚyvMKs& kAg l2NR5.J2ex;aypZj:îc(8Y&;HlJ$Flaz+-L2*0ٹ#2tykU=հE\xef F)`"_}͓JE_M miCE_8)}ʟMޟdeG&7;L;`^|6I܍\׏Q~FS8a_Ìg7bG 9*BNOޕ'oTL2Ps)-$!x!KZwvy#*vD7 @0#hn2BCYT ,›Nt 6h1=q%UN\Fq%(ӷR}y&%#(Y*Ϋ?}AZicW\y!;lFd5r7I?;}yRzF)yμw-=nCdϓmіܞ4K2 8. 
8.%É{>[ 8w`h0T}0nk+.Ĵt.Uûa<)>oHjCȧHڵ"q2 񘻠 6KsQq ygL4\ZBȁ`wB=2M]TmB*R+lL됪 ) 3kfP9ak 92$Xĝ5tN@t2Da: :/ vhɗ?[Ҿ- A97&'/2R8 d]:IE(K\SᱳU[(Ek P'o3xÄC0* $rTlkkfF}H?Mg[&);$(|n2;"(M,8alް)rN4RXbnuFHG$d|IgZ{_ef6; Y̗i|meIdw;xdJjz9tRd"U@x\$9R9AkML(PyNW#0v 0+6ON=Y2d,y[o99f)rc度:x>e>}2%45#DL7&)9AϜrDƵHMՌcʘ8ez :&*A4) !*T PI֌٬abx.ԅunuNuBܷmF I0/\UW;Io<{ٷVȡQ䂊,*%A[1P#n[b.$Me< YL\d{8j+J Q'vĄaU P٬(hRZڶ-]7㣒IY4jj5Xj%4"%w;$ [8>LRi2"C*E#FҨk"0GQd$I5p8aMw0usv~X4bB(IY8ՈFl͐,P;  3`P֕ :AsQ6'H!@n RpjҸ`)}Na@Q|2dLh4!+85kڣ^WuLbz֋Ջ^\7պr\9ZYjQ B,qX!&x Fh.bևep*lqc!x7x'wFQ?nz?~4~dDUq424kM|Wʐ %#s$4;=%P\pr./o.ₙW\x>T*k!. eD.4 3x%LZMsmi8[zk.xf.VSeI­G~zkzz3mn"㕦q6:njZVjt.I@IJo b^M51J3Hra5Bl|i*nٿBߏ^0U L+`ZKІit҆ 'Bk4zÅ˼'oMp}z'N:$Vm9-Vzź:=a2 .-9QF-Ȭ`EUvd Rִi$$fs9 ƀ `#OB#2jIFjp,UڅTd!Z;\12)wN"_dq1CXЎ4l.[E &:*$jΟBy1._. f7OM.7NY4c8;dsUqF l" ;+8<n|lVyuM N/&]oW\g;+}Lg3#Z9}Cz28{Fe\st׽-cd(©vRG xA[AHCKƖ|fXs3bQ8yN>=~h{ӝ`2y'Zm+2"0 k$7,q}0]ގQN3L)1sRsuם^tӿ}û.}2so/?w?L @M",KE{:McN3I0 x$1+#1ˠ-KQ?6dā) ŕ#UZ)#8#5·(JK4)xKm^=#0fN DRP)w&eI@$S=*r5h>_\??lnG Q]7;6>ɑ5_4twf=٧|nI 02DJU'2ܰjnh:GbEQ^)lpO5hsX'jeP-А,QkΤ3%s T 9%ԅj:q47a-=OiE5z{~UCX'omO=əb[ 4KG8),JZ():P E."wF6I׬7iRC4"B[eby߶ceF* F7~eUᡌēJUƆiMFrcC21fG'wv{~ȩRx5.T׿ܟ`̩70.;5dyA&S#ģ\0(9D%;O .X< םc;)ˈs6ֻ.^#:xşRe ay 0DL)e@,qyMwC9m68K5ɉO#AT6:@RQm*I$nf2&$XJ #V WT}IIKDJiθ:0GI]J8JU9B< $p: J;0sl&FCFY^Ma&'uuS>Rl,ϸFF_QV2QT:C)q@mSl݋M |Bf)MY 27I? O>pU3 ū«ܙKYAYh2 U4+a*R%V 7HLfԽ䯜)؍ X@2D's""B8K#NpMDH<a U[k4OmR.4_mޓV>czF}l f~_? .a[6Fn÷P0P-b]D"H䑙H@( 09#5'|A) wVTY.+3&'XvZ3&U ^ټYo`IjJxwe5\+Zfy fɸ8H>{ѳհphaF$B_ p ZEC%Q*8:Zr˖[XxENË >"wJhdD *$IqT'r˖[6n/ow6xY=S=K%х(rA}9e65SW܌U dXum"ko\ 3%lc)5Hn\0URxK]]F**|2,L u%`*C$l !߆N̴eGL%L'Bt# *FJd" NxCI)Ԇ (@ 1P re/ՉK+* j%3x ak':{C~ݾz[sրWoQ"{Í{7Uvc g솪gUNtm-Zv Wkdg(Rx͹tմq8 .ʬ2_?uX’ln_}RC͕rLlQR!\67!y䬛'Uٹۣ@xjOyE7xK9ώPfo-{serqsɵ'* ldBq a*KZE RwF14 ڱPiI%8cV:ykX՛ 8D }cPސJ5^ 4Q\ UhR HLj1 .[gzG_c.  IټLfhIG=O5Iْ,J.vג,_WMO)$|Gu>=y;8Tn:^h8yRb Ka! V:A*:lcyAUpG B. 
=0yjX.a8,$!FcidDEH^T[!>@©G :jlL&p96t+S,*K 5>u Ϭa/;Pm.Q[IqYIlL.3ox$(6I<΃l3R=qD#âŃ2D{C!f%u1Lj6;1 $Z>e6W.l5m- -ip,)LeX,[s 2 IܔӼyҍ$#_'52o-Ĉ#TI ؘ?062b>DHG,̍Ih %36tڏdSoQlxh۷z>ٙ>}R4-ő"54tė`P=,9D jޢ1:ղ:>yH s2|MѲӭ|_A;=#18Z& "b,2Qp,!X$S&*9H J%}e~r?;~ 'WИL'wOLrS+=B&F ^缓$:jݻ$.eX]#uL6<'v3Alc2/f=}}S$'I2Z$nxyKi_'CEUbWW]_ \%n9p"'\%n){pd=\pűw^pm.gEX-jr-c1"FE10 .cܐ-:2,➲4-9PN,=&^`=4 R[:u.7*8ʝ7\ 3fR/=tHbaJ7<6oh;T}.򻲏8dۋ]TlȊYr 8LKү>,>&>8wgtuO CoE@XEnX|o.ռ'=C}OZߪ't<:녻w"^9EU=?]x0KIOV#=N{,qd.疺B%ၤʱJz\7Bi)"!li*=.:NPzaDYiGiT JqT p+:2,"1rbrX+?"D4v;ǣZ-~#=Sցe+sA}ۧg24χGb ari:.l{n)H(\sFbOEQ`ݫ٫/TTHt=a# I f&HVs2F@-poDLaV`XFfhZ mɺns>_.:vdyx' d`#1zpLF8GNWӘG"h@Q!5WFGRUtt;c ޫgu:7 ث]_xi~"lW,7ޞKg])@2%Jb<_~LjdJ3ch, -+m޶m˗h[M^ǐ>'{ţv3 ީq S,#m޶쪵f{kK*F8H빧a5^(0f, LHe]ϺBuUo-OT%ӥ[OK?l]Z0`Xדⵤ%n%y)aK1%LNqUA'߾;p!PA9L+qs^H #wH!`NddyCC 1 09S `Cđ+< Nm6 x$ )[SUcm#`ְ2b=6MVHKD6xgtUh`g&㵮Im)b~syU|n~j FTx%Gh&fwgUINSzY&IvZ_؀\L&Zit^|䍳ѹ${hzPE75卑!Y4[=^92qZ5#0{މG7:ACݗ_nm)띍&SCaJTTc\a0Z;ԅNr=%$7\z FP[?.^[*؅)hM|p 5ݜ&nϓn$N:Y)2pn!FNHn񤿖s&BB=banLBSh/)4~' |Sf@{ ܾ=cyl/*ċ^<q$ަA%I(Kw rP|Wo7e.&!] D$U+P\aBpsni?=E3e!`K A`e#?sʱ4:$@ ff1wkިN륙J?Eyؙb7ry{Z6ʇZ=R?7_,, "Ja@5u 5mMg朗6v׬e3IasU4"1[.wrهypzݥϷgr-}yTS~1[ܬ~|,*GUeY˟_33`c e~~:HOǮTMô@~:ܖ58Hw{xIVY#NT%/G_M'yJ`b`OBATڏl6EwECT'Q3&Ue;b0ƷOܬue%&uje:ģe[o61хw?{fG:cIqLD$XdS5XCHFLTs$xKP ~vAOcȒVq_-[([ խfM*Vگ'XĒ=˙wL/hB|MpuYOwQxsmh=*f_\3QZ՚4%kZVӪLf[>DgA(PK >&r=Mio#U\ >ژ`3`5`U1WtGT42@Bdr 8Uz3p_fPm?N1'erA4L? j]+ FKd\ yqa#=y^F-V5Е;^EenplAVn~OB`&(ZZ@1: CIEt444g1CM* cA]Q|ƥsrv:ik*L5o ¦QLLk]YLm-O@<҃}Jy7 rx`x+y%CtǼ1.Nv_m]&:ffc?E$]# *H۠b*8b|;q'8%Ky`R2!AfvbF$fƘAD8XxI4mt2d֨} σ;S__FbT:Dd&ZpL;D+l!DJ:)PAmS+ 2?KXEC1hLyĄ 8/I~h3DX'>'PRҪɱ-)F8!0ere  k^ T)gId\NXRFzR)C!阬GS II8 @(%TR5c1q׌J1]XlfX\$KV4jj5JiEJ4U11"wh*jE ıEaJh!#2hHuMGQ1Hՠoa1qׇQ?Kʊq_4bL8+W5◢'*sJ @0(pZǃUBZWzq"DExbCфmcSƥKYsE5ړ!%̈́NSbĥy\#^jzq4+Obz֋Ūqmb֥(8R+KQ 8*9K"V x@ Ѩ%^܅^}Xlv. H+ T1GnS-[A8N Ռd~ 7_{>)_0'gBBl@}A’%R%vT`@ zl]ELNiO\Xhe ]܃o;1JvUgvnfǽeXW ᡌēFM. 
>I+Gcc`Ljbg-f%瀒Ɉ۹ܓ0,}v.J 2`#L6`Z-9qc<^.v XYjsƳdcOgU+tz'cI0wmI_ (Av~?t0lvcλA?ep(PD5%iir8zdֳlPZYY󈠦CϠK@TJ00LXQK]2TJSo04bRe TEg@Nr&jXL  2 g1cVZ/FR>b';ixi׊S$.Z$59XhGa{ ς: BN(aG8\2S%;%:g~OFoD fAEࣦ6b3m1`aFNy]Ѷ|$)9&";*1R}M:qcFwT֪K$*Ԍ; Zȧs2V͡hG6 U;pfG'e6Vg'vn \Rk|Yq0HГ> PƨQ5^azvÓg I^ A5B4łD%$%[x^KtNc](C72ʹzF=:uZ]+dY3Q<աa9$LA0aR[:1>"X33O6 \Tq r.UZ½ I#Lu:TIb@j t0a" flEϫĄNi>Vu| "M]utvU7MS?Nh|ѿob3a~eo_D4|R|ƶ/މ F Ǥe]r0rZ}xDR~M/; -N E t|#Ǧi¹7SϏP׳ H hBR1Fs^:p7 fK[?:%s;#X|A<@Qn9ɧp\:O~)wzA_M]oG`R8MN;ж _ЀT530X5/$qn׺ſ<) }=1CYs l`O[X)a' T0Mu6TMqOӜ1Z{ *z tݧ'O)vA25X5K[^On9]C ƌT-8(t=jˡ"`HF0@67XXG9؄ A2 ƨU1k[`A@R:P[/5^#r{s gKbv=j4y%a ҁ1XQY!jBA42`+)Ѻri/H(Qʷէ8.Uh2Xb#n25 DᡬkF^Q77@I(wq,dJ@&ZF NAQu,Қ;-q׆{QttlfrTs$ 1@/aPz~\5f|aWC7 w݌GpzTڛp>Z~Oz &` =Ӄ9.שu3d)\1X=\ s ´5q[<FۂW0_oIyj=?U ?VͧAےtv|>Z ~Pv.mO+߶qJUx:붂Wmf8U'95T߿Wuֳyq#@ |g()k!W?FXʚ9Vy{/cg-j 5ZrV;AfY(Z)N?_NG9ȢZj0-5vMwQ'!D u.:_Yjښ-jm|S/#WVn:ܝ3ի[7oMߐh|x'DfsIr̘o߅E*{6)GJN"EN_ OjeIAioRU~p-33a$-a!7Ē 0bK^ @z5u+%j9,' nYՆ[:DJJC4(ץ!@iP ]EsKCg!@iP!@iPN* JC4( !@iP!@iP["؄!eiP!@iP!@iP!@iP!@iPXyiP!@ѵ4(s?k_:OSc ,|[69-jAƒEL>LN&n}yۣ`HmxDxtS4̸9.?'?ٲ(id  g wL+̋\)qK ;3$ ??NI#`6[uû`iNk*4M=q_3j`+i ?7B}^H}v(Td2IBBJ";S\1gMu=~.G5N_K|Xߨ&WG<ǃlQ+`k U cJ#68R$#YHuc.*5X@Uv%6N,rY,`3V8>GCwa#~ds;%-pN:`#'ߒ[ hl X섎M L52M`B<UF-zg՘ye4zl" 5Dqזm8]> }5Vmљ5/ ;`I8rZjq EYMR#Rh 6>oia W> ̂HB9r ˽)YQ glI+x@K$e6prN9wYs>ZRS\-J)UPP1g&HFlYJ6,&b̌`^p>lČW}>l357Փ]P)}A3[ ETDl1&xppoajfER1x(0gag&^2űXK0c@0I/cvs,&JH8 $K@RGTSJm͌| Bu1YבcTz)d ܵ]IȽGo%.ni="&+]TsV+vx oU7[-H}~_U>{jZUL.'S类%夭FkjOaoƗ/&㫁;u'U"4U{#wddo6W 3 ^P5 ֖S WAs_j⽬O2hQ5H3V 'q@҆SAj>o ~ydh')@NK)pC9 > 0 8^&NWލS,-6ln"|kȧ/7h4E!:$zD0i(b,KTZǙӖ8]7Wy=M_i/^waw%/䈧a"MOބbk!ȢZT,0k\r)X4{δ7B噲xRsbٯ~O^)aR/1 2)VևԩYjʈhA #((%zSUnz;=:4}b3YzVQЄӳW)lrQ DkCnq 'wP^9B뭍)ݛjFV:C*8e\v}Ӡy5[nvEQKmͬr` %KLkKBno麭 oofuōa8e`|DO_g'Xknu1ȶVƊH߭i` K\>ߏ&!W}3EzY#~Nho4'ǰRɎJ+{?~z{ݛ]{{Ax?o.O? 
d.]: gt^G4mj[4͍أi%u3{ICni1n5Zo ~zڟl yjUKSYq?,_n6{TQ^)=^X0 Ц.=&2_i#}N"QqD^q%JF:GtJ+eG@oSVi<%V̫SԱx5uUn 224ݩ;B$g )\tr3dfv&)^N3N,;Unbtfڂ 0)NX"]B8&bqt`Մ$&r(+OT::<Emq:fW:>1X &ly ,!0Gz(#9K bȣ8ONv=؇#U.Tfha#̮eVtv,sJH`&v^Q1K*oR0ooh8ؿ|ӫD2o7O{%s^<0V9j"JhdD B$IQ'T'Rqalv_6Rc%х(ie6FZ"-wR\90ist^"(uK%4%ps*t;(>:.#ם-{^,5ޏLB5O)ʑ(,I+C'/̴0S`LKԻ7 Bhcj3Dp` ipFv j5P2aKj}6{S2Z(OY;+ҞAmgOPt)D} 1)Q|2<$"Jm=mF[|󟄼"Pw~lb KVjݿ]{WܩQJ!Q׷x8nym݉WHpƨw-bweR_EaW0E<6M,Q쩬Dw4`@4 ڱP`=2tV9ih#)΋+ջbbpҧ{=#$zM h'9K:`Gkb@bTY+0Xԇ)ݜTdix(?ή>޿hsi!cԔF 2kL Yy,Apf""`xB'4T Z3Hyf(N;d\7^[-;euef%Ă'1`}tf됮D#xi{!`rpEm*3̮b=Hy!(Gdw &H›}Gl$Hx,+\/W/&8ΟYr|^Id$D[jeL"JX)p-Rqʮlן`~:f]-{v_p`N= u.L|dGx !7pS~;{bV *foOf!.U"At1Xɜ(:6J!&aLXM.i\Bx[< Q|'TbLeńkBOi pO{WnyMɶs@~|v2M߈؟[867k뽅afDƿ-lKml Z!iCʽ |s 5diȨ|zGCT#_60{4a~ooss_F~hG"텶U+^ն.ֶzn׻eݴU9.rpWikЮGrF]Ʌhr[9 Wl4׶\"q' -([%?WCG"*r߲q@Q]F=%^6%g2>(ͿNbu3o$yOy;&f]hhݪ 9H `gXܡx,1Z&h/=a7Mc>/k(ցfQţN*;х֠hEՊ|EsX!D{KJi@N̜'.)Ђ".hD`Q#&hYKnӲƑYtt%W xN;f\2s@c.u)tv  !@HKd. v# :hw}~V =!.P(+fcYw1v :]Pme<ٍW1[MxvX?K'k g« ;ǫgt4EGxpsVFz!RV"pV@ED *`^)hv C,=:J:$ʃIt:cG/:@*eNj QG!\8UD="h%TIn^bOǏx&D|gqkه<^}~88lo3[j]qkԪm`ۖa0sh? CFCl:rK7jy7%\\.p8_*-`ntO EtDy0d]O8$&*y*#Up\Jj%֊(R>%(UHAr WфoCij%q.]Avԕ^'mmT%20^L+Mx⽳Jq|LSZjwgYnw.SūjwA~r]nw~Pp-Y.Zɢ,Zr]nw.bv1{y1MafhXM8 P#2I.J'0tRYg' ?p'ʈ FPi<+$kRp%dd=FF>sԝRgÊZ:̮|NZא7u Fbw'T?_/RG0MPeyPz.; T  >wSj?>D}}@tr| o̫ [ e@:_2[Z_u\Lj:wB(Λl6 ΋F|mqBŝ.R5Bmv>-ΘKMG}ݪ*sfwe󤦳=> c mBu"v͇D'bm$SK ]=ѠMFSo:ͣ肝k;zMM=ʷKt(o|\rB?ݝWṛA-wsE|)3gWnu[wdhv uۦvqr. {q_$diڠ}avJ(^ݔ^󵖁衳5O@q>}n"BRdb["Q~OsP~j&&j#ҜPEC243.@I3+H!5M;' ^K {PDi0&0 v BmeAlQ R8I3x+&@<5Js2crEO~*^8ըjӾB 7T)cZ)RK>T?IaD⏨C |ܢKWxm\CSk}!^yMw/dQ >-lC}+O[0{\ďp㑿ٺ:qOn0{>G#ēXt͡޻ןa⌐[ b:cB[Lg6 [}GḦ́+B9*q= REқ*r?]tJzL{oxeFIX},$ԗѢyf= :fVnf@eFN;xkfh5YZet=WTkb8&MXobnṉH|p3%v ]CwE9_xwCp'v&'}7ZPt#4NޠG3QD+΂cd8/IW4$/БTc<h2Zl͸d( ) ӊ  zf 'k#ly/qA,ctV7qq9{m.d62snz׽ܟY":Wz .U=;9TK+j 8cN%(_k9(qjCmQ=kƙ GjD KYf֖$Q\,/gSZY/K@2SqB9n#d>ITEBDO4DNkf`P*pZe/,$ k:2ű&\Mz%D ѨV QN=`lvЯq479O=U?qioJ $rI0PVJ6Jaz_($).5BւkBm,JD£JCHJg-!)UBG7 +펹 msQ5`tT8NdXQI%hSyP-!b4#t޻+Ʃ)P'cHg4wLAj. 
INA% p-2)Ƌ~UEUAs^%.Y̋BUyn'_?4MUy{Tl:_T*JZߪ0qדjolzj\ɛ7՛{SU׊~>Xznnz{3|GKr[9 v>J[j/*m:ȫ֌6hbV2`[[/qQ}VbSj:LBozD'44U1^TKzbpiӜT;#DL'1]V+- u._|ߙjq4XT4Lݒ_v"d[ŨiE ]h)8 3XIEzMWP-NB'Г99 '4oN'qlg_&->N2?id`/f<ܠm)"vF69E@ & b[j~w+z+^Ze%l~5?O'vݛiJNĆ\c/[Us~h&{uK C]JuXnҜdBm'tp<1熈'V 2:y [qW+z0nULe6[/еn)ȷKδw)fhlW5Wg!@FhJQÉ֞Ji m>(A4:Q+C]`5^8@9F"!4K(\=;78Gpޫ&O5J&jdeUt ]/a:ztQ=5S$CSp‹oH(Qw&oo GsNbD7+'3LY-b tHI)gN:..֫V01'_I/J$ɹ)A)*RC4㪓cHo1dPэG{s_}Žw|j1pBpQ {(<CS>llx" -AxjwR vXMr9 ;t~xtsr'>ʷM62zfW@txއP{@_N ⊠7ypV2s 17.N'.IPDRBHψZ>1424t!bSV蘚0S-=z㫺lG_|E-bza⟇ PŸ MpGKUdBxQ/q*,!ꝫj)jo - 00${19Dp\FMu=ZWvZR$̦#\%>v~.֓VI(/q#?à] PpXb#@wQ 6 2 NדgwzSanA7ܮȨaz\zj-PR@m,-k) A**!?>]!,pmf˳7%05nE l>t*D*p wk_ { ?{mQJ&1F8yfѐ7HJ"cQl"k<5` *=r!K``!fXѠ#I5> 'H PCJSG=r`Tir#ez~XJ1 GFrDrM 8 -k_%Cd 3iu!ҽ7 <,+G )Ɓ3ud$&>cua){9 H%3G֝Km/5a+쓯3lIf|oxBhjD(\ ̃1 l!] FS .8( \wfM!7~t=JFv**_q:w~|G{bu,*t ڀ]@p8 Yd-Eo0$PL2 $pþ|  zcKMCxVұ 7AļlM*PlpvҲ, _4W tVd&#IQqH$[U)=!Q) ;].4˘B:R")9IE$s2JaQiƝV2A  l (O;9{ѹKlB?=xN({LhnAM^qƒc%p]/7n'b +`XՁPzg3˸t6s„S#xC$%aE tHy7&EY,ɕX=LvSM~ 4K)CIz{DŽ(DJtxIWJ0.S00 !~qCSG0t5"KpaVu"XJ^ YO0 ^ J1u+!{@04,(tH]}C֏/.pKL0aHP=<_ $j=0qCU|& cj.Ǔ{ ~Tw^T7\?4+g+:#_/~<}9HR WW*SӟNy?fjm-]5Cڛ؈>L$~=ѬEo|im[ նE$g4J0#a߃KT#yO1.sRnx[ׯow^X#`\T-y;&,vϟbvBĪ5Ip<"b$Q0ZEN` NY[pӍGa|u7ALhbâD-imNw `٠xj!taBw`/\JkWr~6KၱXjT ʔV0br0V"oD|qo7 E 8 ЄX"KUb^]a>pwI7p_0o+|սH5ݨl5V8Ҏ<R>00.xMrD9SPڒ-@G$HGۀ_IHӷ^:ׂNv,%5q~>ʰAz MK"VGYDZXWO?9K*Y0R`D$4ezX7"*xetC#!R86 R5y1dШS(j2L"k/%%\2N@S ðq 3fz7S&OOuQN:uf1e w9r?=BhBEu1H؉`8 QXp) ҩ 3-qylF{F\h*l R#k\ ZyB=7d(,ULO ~E #Jb(Shq뤴ʻE`;,テT,yq+U{ZJ׿\ݼ2e_<0bQ 2)V2""&ZH0mst'j/9v֖R2|%gS4aoE{XTk P*? $ܢtim܋޴b4ZK趍-ޚ}~cA1Na=t +u2XֿO.U+wz١IUE, X=K P^) b/msjV)5 ho{Z\< AX1%$B?F9d_`GZQ[ǿ3|6ܟ爿^.$'Giq L=Y7B|Zah^ +)%NhscGG0H3f@|/,^t-m,k'a6oeϭb.W(5~,Ӕa\a}؄wWhE)ڹyxؙ݊d%':l+lic5.&MS&ft 'Χ6MtoeXsJ)-NG 㷗xX7|_[#;ڒd~HִŌ|bI7rm2۳ ƥ3*-C{fށDOc/cV?YF!Qvư..fOT`-W'E4ƞFډ}Oqn:f7I`.m[r9DS\u˫Y"V]j\~ y9RP-Rfʄ!ko|̥q euCˊtϮnX NnßX\A- 4T[3GSO@Z0Ru܀Vܣ9Va_$~'fQzo0--p? 
fTK~W灧۔k3 ʻh< ׿o`u0pzñ' lN(aCRor{ @2 m 6`RHISϬD×Iw$8 Y*:)L" KErl sMÞ5WcqRk,*P䵕G)2l#piyJJE0nO?2m90H~8,•eZaD9{+C0!UXa#ūL`^ˈiDk45[!-UTkJ)Zh)FΝ 9II9I這 EaOiQ wCZZ/8lfX:<j3r'4 -ht 3 JGgABz)ҵ-13ӌ}k8[X %TQF(P@?{Ƒeiޏa1awIevA=meң EYlZ6_$}{ϩS̹U0#&?bw!h*S=^bx6;[ܯDh#!QDa#.Fl<1iOՇT&C*R^H1#3dɺ5?L{sq9jQU-wTM%s$+-Atz9)!K&<V̇%Jѻ)D$6)hK ce_>Ja"Ҙ*NqVc1zҺXt):Q} ]&|J'{X$m,F̞$^TG.I{V J IPkCuYv ӌT"yˌoh|z` WzJcϨ-OX gb {'0N0vu`;8V(4 ICZ] ڢY cMm.tt%fB\ZЬ2vڥzzEK05eB`^# Lׅ QYjXkY L  "ջVLU:zR")4`k۪nWH+5+02ЄFf42Ni2m5V^XcxeY j3xWh%WG#7 c:Zom@ 1S$R Lh ؍ttg-JQC(]e֜G5cyM:M;"p`L ӌ/%RB`YLu>% qci'9mL:v_#*A65TtgBQ!]`(;:/ͤ=kGRP6Z'o!A!NM!Jr{FSQvOJ "/[.bQ4F^n!GAOPgkIk (DP(46j"b $C aUhޣƻ ~,B\FФq 0(4ne3RTiƬ⤁1&bb trB6!>dUaC9e^kTе#[˂w-o ئ[o(yi<olJ#+J$W{H Tb0P <9?n _fRcA39zjN(dUʫ F&deZCPMhL}辬 kIt u< Q o!cHp+KP-"V1hFymV D v*"}Xݏ7b~N{)NH7di"YrB,GS 0"(Qw@ҧxpC*Kt]H_>#TϺ^H@ e@PJNZ%l-ѧFk.IŽa:L1^B ;f!3ڄ`[v#hXp|@(Ф pd!.h]qcQmBg$0S(ͮJPHƁgUkUPaRB]$IY6 (jS,КϽ't{XI3,Iƒ5$RYI:kJMKKުh{9ExGoR Q$4yxZR0 z1dkJqc=yym9ry6ߖkonT f,% @ztqt*D6IҢGJ$IlQfu6tk*޵%'ՓFCon0&|Wް[N 3bO*Vh E9)j>Tm~Ъ{(-HדJTp AP`AH hTU")RC<`뛶] + EISL1. xrSp͍9,JDy[0p0) p#X("#z!Ni,:W QYj(ZS{N낑Gj҂cMZ b $_\n^&&ci` Z،P-EB9y9޳NKiB]hyLoZajlJ؝")B Y[0m@IX©:ltiP/zCL Lčr(KFD9)⸟zul)\YLH&b!C1 4&ITpIFi&% x@07QѫqJ X)+E%AIV@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+]%83=%J)`;|%|&7(k@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X JoW L |Lg(F C@RV}Jhqb%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V}J /R`ԸQ5Е@du@^ c%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J VQ_Zz~=yˌRՂ_ݥz\wR׳͕;@k@뫳sO2jsI&q>bI:O'ҿaMdgrφf 2]MS0|n>Z`1h o;Y s2~ճf T޶n ޴᷊|qxc. 
^43\]|8{ w.zJH%k B|wrj,]^%B?niEB+AQlC8ġ+#[gy9^εcVզEf(!(H+Cd/Ed'gi0W y0)4=)c<"{YíYUd1A}To Y Pa #Na A##<ķWǘp+cĬ{yiƬGg8v={qŘ鰓~ag`8ſǂ#{9iN> cA#7Z`.рc\:,p)<gQAٳGrq/v{$&km\\qD<< ګɫ+17?lȋ- bAzswO=tci?b>cr iȴ6ڨfD{8yz=930Ƿ  vPW=wH"d>}7׏nt-\~;PwaVW]nc6q:Z jtI.HnJ뱯\l=z֘~esնOj'-~ r(JoMdR~{S?Pn?X߮ ?V_oI >u4- bi5\_\}lz; Y6{>SSfdvn*~m6ARn]=- *?կcG#>ӡ,A_];i(TZ]?lv]og˷k43>?ܟ߶ n|^9>Kk޶ zV_5:yۿ݂]/w٦78q{mAQ߃?;qm7WQ~_.7(wbH+!-/8tNj3OZF#oA0JByE[l4ם *ѾZ4""^6skvX[IF04M8 59OC!'^??v/jR:X'{/={tN{{t*S(9:>\eq-YEu6ҶzOȔМɴYȃwBoUkӳ~󒇇'/6|ޡ7[-o 7;$x6e 3-Njynkȿ87p$m XiCi1Ƈ /;ݼBr|i)vokqM=Cz&ĹP}+?MKC8NgO*SrNUVH% NiS E9q=y8x-$-YJȨ^K^K^K~;Y{ዝ(Y; fe (LSP&0wOC;p47 $BSr0V! VF $Dl'0/Z<=1tD^IZs͉|bj> [RޜRMrHNIUJdR1Ğ-mjIg8|ّ3LT38%g D_FC=xdzwȝWߚP[FYrB]4㻒-[^9Q>Y E)JkIٝ2]m{d}3MaܾqP7m֖s*iʯ=Xw cB-a|k/tG֭K BDdd*O*Q.M_42I:.JV̖\M:v)'oٻ6#U'; svoxB?%)!)ʧSjD$Z/q=]տsL_- =#uHM.هޡsKQQp !\.FtjGυ "Ȇ5hI{46`0+W3K՗a7p+enVU.a1|.! P ~jۏwcnT&VkjX>)m.eƔGG'àr*qN(4'3jPF()9̥ )gn>k?=/P^FpwV Ϭ@v@,2Dh Wj[\a..qĤ5Eu'#$*aftߛ7V_AU5!,jC.75Xy(6YqG.P9Blͩn=gXZ=|fqUn!F-W\/q /vjoݕw?BF4$aa67{dI)p*|2.zsx=\\`兀u9ɦQ^y6ÒGn?LfeB9ʯ4ȖszF~Ga*:ǃ0^v/~?}ۋˏoAx?.O , H&6VX(Gpg}6] -FZ.g]/xq5r e-aX%qa͗Gr6z<ٞز54V ?WtN8hz'}}@$lFw輳TntٙtvzםqQ{םy50;ozБ(Z6h ҥ0RA!=*:W#0r*N5_RscZu$MmX:\FuA?ߨPUź,p+mJn ]*KʹyZ?EeځfJ$D rjV 0D;{ʹLO#,|(ZLKy  6& N{MP[Mq&8D\j@Z!Z L8Z>"Ԙ!V'S.yԬ^iϺv팜2I*N"$#%J2ᅡb D4RD Az*]|`w9!oC?Mr ϓl(|> *[gW0|8gopQZ7˙pd )ӄ<XLr ׈7|D?"9|RmYk.~1 S9Oδej}=/{k > PN_8-*}䐬8. 3j7Id TDwYz3D6:!3!2NC(wvQnlkywCR_ıA2MWxX6ve*Yl̗2Azޢվv,Whܒ *+ղ o\Yk"6 떡 ep׹׏mښ/ 4j0-B9mqͮ➹DҦPlK|GOFnJӠ E] ' -x>G9yowFnUtݹ+kнj& BR Gs{0T ΣJ1 1I|,dy P:s}b,1~.ՔWNsxrwb.5DaMA1cI:"TQ1!^"\5d$T4*7(K^/,9zWR2ODE[jDN,c C*aEKŹ9]q O=>s/uZ4l[p`>~.>/u!3׍vjW.Y1nhdV+I1@Q-*.+<!s\.߈?ƅ3 %Ku`P'}:I2p[1c])mh"ʍ11w>O/8OOn5;<gкhnԒ0ٍ(;av14eCCVc| >EHm2dtZto(9{ݦ{EL?egbK۽ehnwm{_^hJS´q6h~웦.GJ5ǒgs猐uɅdv[? 
wb2\#5-[[wQBCEzQ@)h ICF=%C+O2^ECQ'w?wOS#zbjnGƍɬ,%tTQ*v`)a'DnI0mKa9vmL0OHe*Q+:# } FQV29h3r8rx-K1m#:}̞C*!si,@6)AUMDr)^]<,/-?*T^e.}@P -Hx@ `z4v !iJΥҸIr/`|0N驳t X@2D'Rȣ.@*" ODH<a U[k4O/>t}V^ub)= F| +Ilτl06.r)𻻧|ӫǓa}R{R껁m,[>f؜ΡLzq/e.2by7\\m pXm0sWh CzTGHHx뽈VB ,S$ *f`tKaN v #ii:Tq\h*-*ecUdъc7@AJQ sj5:ED{Z TB%D9X(iRETL-Fs?2dJ6|&hĂa(LiWf獯fq3 ME_3āR5]/#rrGP\u&bAА*{X.zNtl^Cq~ A\&g,$XD%@&8QC1q) Fh-'wP)9ٻ߶n$<ElCp]{۠ACrh#K^IN.Ӈ?jYܨ ߣ3!gJ zh'炨E伵zR:e#Z ;_$Ϯ -jU ˨R16jT+rv>¢*IatNz<@"SPoټJRvO o Y*6r-'[HX lt1hmNRV ~OZ '5%ɁM%  BHKX=m IN+G8+/)Ugby{-VD2Z*GJ3HTQ%XHι@$GAgQ[aڮ&J CvY:s\VŚpxkpkQwVۥkN:CƝ,&)ؤfރMpm<]s`HV+-jl9KHh,BR<e-:\bV٨PO)V{]=#6aW;52jI8dir!)2g^[޽z^jvjopC*'Q蝊d ڲlCѵbkQ)Xz`c-JcZUPh%tqi+VKJiS[_vm߾NƗˋ^'EB˯5f:$l~W_5E_WJ?o^^(5ix5)bvjEۘx=xYu/x:5~ovz2ZVh/9 CLURd fy9iIO8 xsDi3.<#'S5:F538JT .''l"ƣ|},Wæ?sV7<< 5w5 8 w5}0]~fj U襇 `NM+<[5;H%Kqrn 4'8E^Lg^0{Yhd`(D˗m$+ JVkfAzӥTJbFϘك9ч?}GlKv\+Xx^ʝm%%<_.)Šųi nPE@uL$&?~?VG?|c3[u3}{5븦{s3sV_J{ūs{K67rU|>YvpYzX5c O%jCuƓ\[wK y+n1:{ *+,] 8tGaΔK7!=q2źQU\;FKCly)V5Es $Ȣ+;+LV N] 1) H)ҎgMC؏l9$k:(9^ j>8'#)){}( s{G65SCACG`r(L,6c/š|]B躵-n=xXo% QKWF—D,vsIܩF:,*w15$cJ|2f HK9`.H k(B0)8jKX&ۣM؍g{e3L߼Eڪl#7?FN'w_VGm)VzZ"lEeX(iwE{mݳyCI,H=*\l։HhÚJx+Mk0c5^GC*wvю/N%;cXQ4I?[e" VG#>.dOxu]*jngaj3bl$j%rm^ 51iTxZ[󻇎*}DD=@:fD EErZClfNLIbX{Jݯe4lRkm?J&G2PTWC}:bpi4XTZg9\6T=CzdJVG -e_3LyĎ6Yyv4a/l"~'Z_k#c!Q,l8%S-'!Bn$l(BQPRI"u!5Ez*G *1К@spTɀG![~ָy7rܺ|hmX'Xg[ S}(19m,"jo[Hkls;H 6e (!)^gS& BRONgWN&GO= jol-}>RlU 'g7>S?M;uMC{|qyrvZ7ˏ?UK~aʛczi=*P@HN9a5YIl-rPjֻsQԃ%( =.UKjU`47r6#cJoXg슅g,tG,|R,\QT6o]F84͆?ov)}ooؖ}(1k]0 ʖ,b1f ('"-: SqC7ó;P=R6uh5!¾HH[I|}xo}qǮzFpDgk!e2o ;X45(\dĬZ˗M~KcJf-CEk2d%I,T\Ljrw%ٌS_3&L{ш~ш#" Q8ުIx%#IQ2['T)XmY֭3j:et< 4dPcQtZ315;)'-S@'6Jψ9[⩎v!\슋g\#.q1RQX,Y+ 5BV |@1Jq)pP7Uxi.OaI 'kJO3tVJ6yEE1ǰec81*WяW<*yjLq'BOW4yyvH/Ӭ);\ҿ8HQW_͋u=Wot^|Mw+j%UZĒ.[2ZnE3ulwѫ.E9[e:C_mR]K$^MFӹ( zqtE+?=icɹ_MV^NyYޒmTڅ"ZTl5̥"T5n/d ں<7t_xwB&/XmeUa}}\CWc!P#c&p|x^*#ڥ\E%Mf BuR&6VXrV9L/e<; îV=dqo (rnڼljr?F%슏ZvY/b[t6^VwVk2knZhc4Ʉ5o[+{kԽV#V:a].ϡ_pm5]w3M3к0Ъ;CX˖4F E傁aXa2n\|V3y>|rdQ徆+ocdwv/c LJI *,#U"De!YSG 
PXvM׳Iiaw O&bݷ8ZZjL?-heAHB(6MDtk'b@dv^ՀX)4zܴk]OJ&f0,SV(nw"[L}=-f`;i@<<ػ8]M9kG)uiX{ dY%E0r-̀DVMs>֫W#6R)ƤٿwtPmTdKʐh$!J,Z3mKHְ04ၹmbj}D,%.mU'QX aG3ZQoDR^€S!J! fpw˰aJ=KNJK$3s(!S#X>&f2 s)ޕCl=09^ O A†Q #H)#h_.nAJCd.!>F%ſ'%퐬 ZJU'(meƄT18'UIB<ۢEABR@A¦Qg7SSkk`wFO$2jd84i3A0X< ͭ]b \""͘u&"9ip>BT6zk$% !ĄmB0a|dN11].D nӅDOϪ&6ፍ`ætdYt4ɕ.$ U40&;؊D4cAL6r;XA'y@y[0ja:0V((ƈtita`Y;:P!E@ O QѪ\ E EΩ X{umhҢ4*EZ0y:ҚI226m՜QΡ~Bmrs:-燝 ]TaLMfL+vZ&z X[T  `ae J z΀|%zIѺO[pJnAnQ%8QZ{*.(=u@B%bIPR4 V:^h .* 7m TR5bUۙ~4D2 XȎl1"iCtH;tR@Ǖ0Z`4b(m*?BPޙGDh N}UQ~Ջ6BuMnⶓ Eh'jcZSn,Κ67g2Ύ 駕[Zjo  2Ъ;7/l:=M߿W:W[ wP寧04g}LJ (`GQ]%ǤB@F +m`%Ч5+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@(NǤ# x@\_j+ @(+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@4(|="%uh@6Wʳa%'B4 J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@@{e߹ޫ_Fi)^_W77εl(Oϯ6Wn),M<Ӣ鼴7=.&QK`mg9J.}4 HtrVO[lOhlkLr{_jX9Mρ<=Y6E/2Ҭ=ߘۧ]5.yjd#Ƽ>ðY iJ&Mt翶;ANEɕɕ?_O&>y2Ḅq8->vkf\{lS75tB_޲[oi~-->iz2~]}1qvLE=9E1~2oOkBӌ賓)&PuO$K]ȭpDɹ-rgT! >7z꧷n&~@ys|ۡe)jdͤFFNv?}~ gیϸ\{CE_N;ߞ]]CeTAqdWلգ2gN UE>ӧWYۡDϟOW.x ޷hÂWV}F/~1tz9a˾ll#!WCzE;y9'o6yڷ"ۣW"}`2twsދkn1X^Pq">aSɦflo-"z!e< HZ{"(Q{`*\{W\Tfܼ6+JRJĆ$_KŒ}(ac 9fߵ!JvҢR>x {xZn1?yk`moQM+l!s-roFQRU \wOU=‘2 wft5Ư>،&iQ& hZbԶKQI޸w-.]tӏjѶ}oUn-AAOHge-':Gĸx=67QoN]ݜǐ=:;T {uxY*cn.ض4%l4k˕~hٍi Iw/8ia 8nOE>4fJykv資 +[0k!`L4}`5uEOZDuܸ_7Yo?^\yދukߓ:u Z}-pć[^ysqf`']Y\x:*5˞\sπPd6Wa7 tirj4ޑ`TVܙ|< ֬GNwϲ'3|#9߰SΧ3\V´TϫfIHku5 t ڮEvMO'aǻ=6`9]Y0J<~w-=^5WY}fˑg~cHllla vjLh&KfhI3s27 6PXwA,1CFArNܟy;9vQ.-3'.q4X$p4oU#;aS+F><:d'>p+ۦ2 [:9М_ə<'O 屧lzE0yQ2:p`B8Kɽgك穏-ғ 󮳪2YNyrYϳsݽNB*uםS |^{!o'ON/7/~L0~Փ5~]\2֫yPrBTp(ԥ7`ώRyjv̦/x7hTﶦ+-GĽ}3t>=]l'|,u[{}Q_zixk;yRm.ƙо|НeorJߌڝՌ/ n K/ikpojFho52>ol mkzyWC!v l) 7Wgc~.~yLYz T@f|%׽WZʕ.A?{TM¿O:y(׫+jkU]̾go<^g?uɎΏ ؜i+71D&1 *kxxћa_89c?8zr~;2}_thhv#SxH]ѩWb19's$& f.Re;޺'>p`Nutޟ I;\(Z~C$D{P++Ƀkae-DCNә+Ayhz遀<1f}EI&'3=f6f"CkmxBe@Wڔ+Vٻ6%Wl-bYX,{r'gց1W2o& =38H==v:☔fHDP22+Y4H+n>oޜ5ihZl3(\_crF!]'Ke%*eUγuzi=m;d P6+qJ99[$ndžx` &k.YNlfVmղ@ Vq& ۣ7[2yZIYF)l;BkPF66'g!l4V Yf&PM&oVQ3g(F N  V,dAU$XLR ZJ梚JȘb!Y!tYێGt^oΟ_ x$kYV˩_0+3 ZjV/_Qg>}l#ğ?N޾r+DW 
pHZ>pxAa35"u-$ItYm52vҒ/ YٟhMmd%ddp Uq@4 a'nܳtSpc P) bPJG8lOJFq#0ZƳWMf7gdIӖߞ_%Ker_?7r0 I0@Bʳ,$ KuXOvKt拶SMsFY HUR[;+lךU) {iRfI*ye$սfTT@J ))((U& (M$[h*\,皱`bݫ3Z`GR0ˊfEQջP I3g< SŲ$X!JeäHL23!aQrDNVT;/9PM򶵫ڢnm[ϫMdg7LlAkۜuF'bd.'<Μf;1\ɒ `j#a1Y %Ph'Y$c9#vĭhg8aDK)w 1dX3I x(ݶ7=&6@*f=`S! #J(9Hv8yS ȝ3,Mv)5h`iŌ`7 lX> B0ց&+U2^F 3؈S@BLVmbBІ3,Cl8@(M~HC,8VHa5lj%Ng\S,h ##Ug *3u .tx(ٰpz,f;Af]|;S1gj٨P*G Y2/C6{n"%}VZ'xdL!n+f8 1Z%" x^ v!;=E2° qA,!4 Qc# cL "4~DH?e:lc'N>Bt;c $͉#y+0-t[)1Zj(tzA!㭊aw c#mL[LV(,KHRM<Œԏ9qӤ@e zE1쮲ՠNQO:׌3@D$Y1B\ Le4MR#,ܯs[Bk2vPVZ|~V47yel=qɄWךt 9 c\ pF3Xi~e4hCSԔz>!ljTBaR;w_};ir3|;RL7ā-F: "ᛅL'.xu6ggˣ)Sy Ɗ.j5#qɷ_ .v|h&bZvs$0w지`<%O3Jzr8jVx>ݻ)K`=Aj. BC4݃`A{4Am$Gەg\Lr.K0x^{GA<|Rǎ_8cSB|qf)O(=llR%70,O;0v2naθ>PVۡIMOӳVBhݹ8ɏ{.r1u"_Qګ %^yɸcɔ.)(h<1;sRB_nY;M z*>xf3Ppv4Bg,Ijtbvr9|EmZ1w'f8+|-*XjxwJ?tELDpOwFtw`df.\!)噝KEBA79]ޠ=U\ݻ1B`ڃw(jBG:M΢Xq?,\ O]´ {:Uv h^nZw8tsVb } `. Ž]6kd RN\Z"7P1FI7R\)%|FL1ϭ G *kd\gi2v<锶ꞈ?:;]TWHخRLtP&/>4JZj%w4`byXw y4p ~3HVLpP"[\A})81'Yt;P~0r3ƽNy&t ?Xq?k,Ξ]һ* TڳU +M0]yI~:%TJV &@"uB+T(S3807i+: jw44]9SɂF[ƷFߎc#?X*eI'73@9u`TvZWj[ǭxoTJ3usb'q{qw:sRYC;}Q7#b26 1^qvʂN&$O6,Uk˓Y(ZNT#}peυvTž*4)JIX|tSvi3+9Eۛۦ?ޙ2@jF7fWtZyZ;@bREU.RN )80C %\.l.\H)<;9<4ժuG吒1;%~I?E'!^b<Ӗ4Sgte}U~EЫbz* vS$i&ٴߘf!59bbT~{hĿV?gI*S||u??ߓIF[s1]6NvD/(XBeB)px8-D -ӹTS_+RL^Yg0f#Lwwb|<`"O[uU/.LH;pXTx/4NI +T &fw?R.RT*;JIh/'СseP)N'QtC8{Fuo s$PA < -J\(XBy|2M2< d;GQl[*(T@td=7' vMtԆPPG )@km#wPMٞ&۔cX1wLL{=vkԛZb"#RNs 9OW0͊DPpÝZfޚiiНc<#񆢤G1mnH@eK4ϿRmEnT<^Q˖ד\`( aB,^#h픶+-)PpPʯASxg.Kn&<Z\D%iCHP R%ĂB}{F%[5y1 `K-V{Z 4yN#Xris1슡$9 d6ex;x1?Is|:_ldumo 0%cz.Ts9NJ eXo_ +.$";6=^ɋ'E K.gZǑb,Cw`lA>.&@nGL=eI=:awzbŪbUܹ8eVJ5{[fu$"$cFH‰5z߸ҸKbb_ffnش*7y#avc2r4{,/{~{A8o ԂtL)kus1zipQrG^4 ~wL\3w]WT:zt0$N4\,KVZ=3:A/rȳeuIe^x,qOaKdhDuF+X.gK2N}܆^a'ˇ5IB5LoSi KΏE$Ͽ0Cl?xVR(] uwO~Pv} Xb,~ҔCiRK7tz9(՘vD "ͭH\1guÑֹt-^ "sќ^ң. 7@93 wMqd"5@VKT'0`KqSQgzF8"Jl )uk6rg˺i38ϲ(NBt|ZO2Y 4zI$ODУqgwxUoO˥U>0l^h1OnS8Ef16>_K@#!ĝZ+FϜ8˙pbU|qЭMT02C|Q1r>0R23{DC "H@R,3nJ9{Z_@]_W& Z-/fp`2VƇTDFU?(gw~Ak,xX|5&G¤bchFYyCOp ҷTX`? 
j"awUDc}'h,XhGvޏ{;aF07o !wϏ-ɨwU~K<ݥ>h_ |GFs]/@tU[;cb{ȹqWoJ!CИ~[Wp?@89۵9kWȠ$(4ޞ#?ڮ5Jg/>Mbf#,)Gie^*-6H~*]tzz|Kk4>)<#壊Awm{-L3Z '?cG~Aa 槙hT]\=viNhh,zN}pw8 z["_5\Eʽ³Zp gB&EXVk{%&ڇfMHh]ٷuV,*hc1 ɞ13BLB / S: nMEՎoP*d ZgoG^|Rqh\jV4{};>s{0)cC㝉&qqJpXzM|)τRd©<׿˸G}sA'5i"Xc0?"x7-5 )3Kԡi}#KpP&qq;l"R/HljxQc-Tfr Hׅm; $ ū(LM춬M_| 7)I/z,u2b? iJAœ3ZDMpJD詩IFNV(#Z˻Tŋz^n7lDΩdS{6WewaAIL;7N.p V24ًWĚa|j/gP|]_Kτ[?ag h2Vߩ{(Bku{ϖ8ШӠ].8O?Ul~sc=:Se~qDn&8_/n\K6y?ü~Wh4Ϋ$KS`W*a`y*Lc2HA%e ڨVJc&kg?P *<#3d$xJ p; <1e~AN}I=^7(1:r4+|*|z9uy1:!V5$^ΆeCSp'PW8Td-Ԣ44B5IQagʍy&e5]kĘq.ȃ3͈W!a7QE}bS0vnǷ3(a<&|x;cT2x|x%$kZPis^4 P޹"9*"Ju q`P)I|ˮ T2;d"Ќ.Io:0垀>8ܜٓ%bͳ VD!P$8W8:f1\vx+O^bEתBIߖ4isɤ kt- 7Y36,ŭ|G]WˆWor 3?Q2uOUa}6xXՂcEiX(aֻ?,w~>i1"=s1]i|>ddMUSpvOlcJZhUd~77;sWtpk /:eyֲ͇ $|`8Tw.] $Voo甛ݮ|tHm֍*Z" r (TM :ҫ>;{ R Al$nZͅoG֫mzeN HH@_Ɛ׽4{.hź6yv3 Ĥݹ"pR4\S: fVB z ,Sa< ^6Ln'iq/쀲H;=2b$L#R0dʘҦPA{$, -&MeVO-ㅄ ]"PsSR.04k-- d:)2c=nqY"B/9a NsE?v/_7ndD=̫Y7[@@a,phz-f1TDƬ.Fg&!/'ED ;C#]~KVj{4M(IIzv\4;SGOufd M(4*3@0g*}>}wpd7^QXd,b⎅ӄ#kQ+-0LsJ:Q?&| hnZqG'L"it#k|Dq]Z0èjtˌ cy"սA3!V 9PdCD"XT3td׃M'48PHXʝ B'x nLֹqCKksV"3w-"iTI!(!Z"B2fD%}I[oxV_k"4Vf`aGb7Tx(9Z*,Y;@- @nGǺHu]UA_xW'"|7*S=dD$1ʃH1.0˷zWuvO@Y3-1jg^ȱF;?rMeI(XFf083#\@b(s s\{'[o(*I~;VSNXt@zb cw#%e2Ϲk{hA= xd/ շ`2p@szbZ r8ʓdQ*q-zaGfo/"ub4FeAc7ʛ~# S{5hjW{F(alQJ^1qM7W(>%e uUv |}k^`'tH9Lg"/l?WU}vț$i;OTKs!3IЁoH; jV6& ɭ,qXpUn\i`1뻛cqoQD2AS#Q_:*o̽C۸|aDe=6v'ۆ=q7jYpRjcZ6jug_Q1آڌ]O>vb0&{t*>_lH^:J71ݣ.-,>a^|bTeqR2fufH(jFQ]>B髟9Y##NB7`rU[gŸ~VlKF>/rیh(΋ҭcI}ѕ֍wKhY{ĉf`@{ wȈ9%XI鏗}TT,i(F1Y{ՌC?)ϓb@{HD}<a0JqxxIf:TPeݭMOu~ &W]@٬NRԬ^w)TDPT\C0>OiرY3֜jVIS^.4 c52yuO\`w&gn$tU'S݊#:Tboϯʮ7W }Vkؔ Ki04ө9zBgEp: xaq WShfB@V̛e£NiѪha H\1$.s%,QQTz[HBxz`%[##:V3z!A?gJϭLLvx52v,8xIYة綨(OVug;gA_yո8r:sՠU`}2tӀ3ƛٸ>;u(;^Fȥ[wѰiTS$*N; R*~CLCK x==vX HqDqBnx|*;Wg 79I( ๊9ӟ` a??Z]Bsv-1Trɗ=XmCHq ~Qi6:ҠwdI@2i#x&iK\b5lJڐTSu,9_LyP 6PeZXȓ6'Alz$of;:w~dS(TWP܉gۉH _:WJ_mS !NXo}+ |Pxoaչ]Ԓp o\\j2C*Oh\ .{{jBBCje Ekr'`:zB֙j|;mvYQmp.7L Ceml^|϶G3H8 ION`:*iY##TT0txP\jdDХӣpO땨8N%EC1:&wDōu n'ugc^H6MOMqQGUnrvr=<Q@*XpК!A48EӬ$*_+nuq:} s~]ijJ 
@%͜J1FD*I=SDވAⷴͬJ&Q9l\觜Gvx 04lbRI;ED(ISI 4pg\BjdͦEb+*G1j4}<&l+P+5D\g2Ή6)&ЃԆX9%GR5ӵ"Pj0:Efʡ c(jvښǥ r% *ssJCF"M'/?Y'gk &xʍiPyDpIME>'nc p+[&nF?6ٙ c7.{gc? jbX@ٙ],E7zK3#6gd>5f .dsdZLYoI.|b Q jYMQL]A ])8MCqk0"_ EplH1 N ;gk٘/a.X8.ooӸ0r2r佽q =9W7P6jdZ 5a0$QXI <וQT=w'y.q4\gR## )Xu%Kޯ^2k9pه m9,Q]a ʮ]T-=4D(&mf=-{WD9rU_0FƑxcv(Y}9c\mFkdDIwLujj &۶`w.!: &,W!x?ۭ LfoH,vhf4{§Q +rp` Ôx8X&ޣt&|ՎI~<uu@HQg5{=AZTxb6f]~ 2352"n8 S͆pg8Zr$[Q(n*Rw>d՟oMWimY6YS&<=X?Iil$&Ne -RGsLj4vZ-g%Np"Nko߿_̪uAǽ}4wGKΉ`"ci(LH&賴4[wD>Ji^xҘ&N52"kFig-nW) >ӴcG]PU$l0Վ GYјNacOf(qQ$KYFqא,} 5}cE;P8),F :0jv仉~+0᪳Q!'սa&VF7(0ᦳ!WMvtck@rhBk4eѸmHzܯ)at|eS5c9[c>} :$R^jLV,%e{:]s%UIHMZbUSL3"XU- :*̽56.1d=6VhJBMdU{GV.e$h<'."Hfr6fjb]tQFdCr]HZf9E*ĀP +]eR52>]ui K:MKr7y@CwQeC{- *@GmȂ^:0]Or RF> 7ϞC;4HڭAĭٍd.ПfzL$KIY_7hkdDlK\pD,w6cjdڕa5H鏗8"2pq} Q/.wZ'+JC2L][o8+B~)z{nb7 )2IŗJR5G(S]8xs?qusI ~ yTu-~\:pU EURgKV盧G޺[ܽ߄7`Ab1_xMj %Q'MF]67Dgܧ{I쨡E#EgAM t jzZo>iG0O%2lUk Fp[.?)9H?G43]^^X(8,*,s=qw+ CJjv Uۇ⦤g%+4} 4%LpPdpHoj,h ث1>G|V)֨c&q VMLZ/;,:hyVfưC3Bĉ +;NTaWIWI=_$%@Ԏei\LwvƑ$s|÷ֽ8}Sníej-gTt-v{5]Sb쮄ޢ=ryd~quikR䙨Nz3pҷI_aƛV='5Ȼ$Yjŕ_6钆R=[vXT2HIv~6MauH j /ж2Xvw}aƁc4-Sɕ+P=a#TbͰ·D$Lu74`:ܨγ`rUp#!i*j۞( NrϑC L1I5 &ݥucFlbjuɎ)T7&?Py>x|9ei9Zz1wFca48%] ;Єzqޅ1(Q3BwwĪx8; שKU#k!1+4v|6oKTCc@IyBc]:kǺ~ߝN4DԻRCc+,tٗjRra`olig߯Q,ʊMfYp1)DMNJa wX(M"^L?gCxr7ICEb52rroq)֏,VjQa_: uR;$lLPNZ*Mqovh.hpW܈챠RB4JbR]'pa.8ߩXP)$^=tRh\oPۻB!jV+H3gP$Jc.ٚƆs7_DC:/՘M(d/n2Clq;)lzP˫TӖHQG 8k\y!AM:jw=x]M5B;'Ԏ%kTم42)ʪİStZ"Fa]5 =ꢚOfGl;^fE4j</VC9ZWHwJ%65WW_,>|Xl6!CcSֿ8ή[vR :ϩd(\"ZW"P-jOI-Cq3>T}Lg-gIU=tRh˴+~/chlDT3 $0P4р4; ىHK/sZ!B^ >C+>zf&6jف׊nKVQ6>lAzMg9978xڕ(Zz^:ؘre@/SKy br 8/SoǦ4N^/^F{ 99p.Q=147oQ&(^J zVl/ :]\`KA:4aGEkՉpT LTٕBMh.Wj?aS6)OAڛg,U/b9סU]xK1z".ʃ`ȅKR bd !OtU[4Vj>C;V y,+#zpGHtIɮ)}kLk3MAgܧKˤRYњf'V5Ek˞c5vN se(-a=ݺ5GKv2 +nʾ}5Bq4vSY,tSxF54$iӗ;q́ V Lx=yڗ@\?*qNCu `h4/ڨ> #D,iwui2@޻ZXqڼix9ށƸ/Sbe.;m}oa;B$ BsU8Qpp ιlWmh$GŶ ]O;k8vv4xODW:Nxc"{v`\JS=pwsfjRUcKXۦbg5"s~^^1w8eD9ֵ^gJ׬$/8Ϧpuèqfƞ}V#>]׹:->t}q _ q Fwz_ٹ6EtRk7ciwuO}y54i M+쌗_=s+#7EAUH8CN4*c>T[QvXg0;X 6(u(mH|AU.JV0ڵ2dݹ];"77c][#he q|K#Bb!0d;@aAA 
2S8HKu'}A -v"+h$Ժk2=(Tx%|T G@TJgb6=h UCR(>%3bq3u6v?iCapbƀS8(kk X"匢8IvGiT.ì5O;g} wJ!k5RS +P8-~:9d헟l9Dzy&ۂUOtZʈ.&F?o`sA}<|etg K:6, pDR Q \̿e%:0 :ÕR\g~_jjs =44;\<$ uJYj02ĀۧګW(q%r)]B0͒e~7R!k0ֻ;SQ@ 2W -[ύ#9fbgeyJ{svg ճ H5Vi2"GNU–m5HPTRZIOMW>#p"b[QT㧴ڤ_N"Ay㵁+!-ܝ&|jWb<}+D| [4q9\3!΍%&b JF~q5Gze+򀃏bS' ODn$e*JL BLm~_]nVz++o*}[B [|YfR|cgaEٿ++D̮_w>}C8 )CT;93ƣ+zjB)8rV-Ky \W#Y[mvzS]PHJdQ.FWh$omI hL3C~VKR)Vxts+kr޶v2ڏ9 q1wŗs1sU~1XHq asN0#RJ+ |!4sGm 6r C$6XL~m ~mk ;1T9I_{yny-3U&F"]fA c.Q jf%ȹ9͖.BIH0G}2T;ypBs*sp'y j}MȌ+<fT +iq%#8r Z1ٛhI7ђx16%OoMJB]I ;TV_*r!gK;~%C [*l3nFm<Ϝ=(hrSm%XK0Cq/a:!#g\mtIfK7= v|smb^b!y'=&s r ETB!'&ծS^lχ}OqȇZg@I2,oqf.bjH ZK#Wp"'ol&Ve0J8#KOkw Z;`QL)JaV"TjFv4i<턷2`nk$n_mOt0zJ0kcpv7tŸ3ʉz?sܹ!0^u $!JLw:k$3bu9{ Fs`HIޖP+w-<ݝӕQ)dzbƍXݘce3&~]<ܗ>S:#~[i`jt <9 aFY # y04d+-/꾉 %4OӪH!Oӂ`z>w^DJ9 i6"O}N (@Pp eH JhQ ?m UQ!i^mJWgE?#rvj Fun!鞅>gJ\bčHHO [jh*=$t2J%j~]MDS.]з2轩0fƻAQe gz㺑_1,-HV ) .ecIV$2^ߢ.Q>-A8i6Ū?pKbl1XtNݨo}@QHw8;(`˰@sm#mmMsmMor֛[>xkꖹVz#g$SO dDz(m2{ɜ#_}çÍjOfMRG(C)-RXzrn l)L eak&*]AdwI1[0YWra6&ZE: /\k$&/:sB4 `b6$!D@ْZjԅ&שߝ߅`_ k;aQ3֬|#O?A ߎO JGu_#E3l)ɶx8jj7z+d& H)f'6ahv.V~w8s׋D`V=f7GPjw+~A7*!BQb{rk%?^@![fACw*\`^6[FV}֔5& QDg{EF nG sDE7}ٝg^X~S'+ܤs5k|?c-68@|Ii`,!3)hאlP1Ew'b1(V!` h[m=;>3y 1K__OS{"v-˙}Lz {J! &nE\-\c \ D9䆎1K|-30viGޞlP?j5~I_ݫމnlH+~$b9%*jko4g_SrB 6DvC%WP>|ț39o *V'/e5k(79Av Zv/ wnmPkۨQlaU6,v`F:hWb̂9n3;ܝ1OF}=v iwmjE.<.u #s;_S _-#.>>L9|5y筮kv TO[44LeUw ǖZTA -b5͠fʧ!rwpzAu`S-$#f 6k9%Ɓ.rzWLs}&?BkrJF;tmG:rsy[\eat@D7@ TOY1yW6SWۚ &ks2lsQVrUuitԅ&N]|3;x@f`kׯzXٲ݆QpIc:V\YɄXqB1m.SvfkV5xo;)c:uN,z(tTN!F]PT՚5,ZÚ1_7rho0=x 2%y@+j(ϢY<>g13aK ELs}Dر17BC`tM#RqВ MaS;{rs^Y̜Wvi*}Xu3{l̜W6&߂l|}7_zwm`wO9q;ɢwO&uṟJ=^ DzJr{0z%^k&9^>˿"֮/R]Ծt2_xɞ .ދ=ԢS ahHATX;yʛHɛ|3UD}* y q>:2-Mfd4 7d]m&hQM7)bfI8_ s7 gk$*b+NdI=_| %4rmFOZrO-1:>{ro*u Ev%k ۋ%?+=. 
%jz}p_ ᐩ-ph;j¢KEiBPeĺiG*gmF1cWa{Ȣ%M6x)6iz`@ÛJP=CďY ?rjBZgTKz"rXTz>,j|O`>i^ '͏!`GOt{QWŷУ k .xʝ!~:~E_Hco:߯?B W{su~Ǒj0)XI뽠#ȉڻ~AOJ[!sA[FF*<ΈTD.9Oo~zWK=hVyFR'ݥ[i㲾EU͐&xoqroSrXը@By rM @>ȯ$ܲj:\؟ʋSg'JY79%ՠ(N *(((&wRɡ:*v14DOފ|FZ㒶*骃b(@XgG98E"s3egMSH3{&vӊG>G8j8g ɧBroTJY=?P޻J_Vv` }#ʅ]f5Z.47ٟ\nOqOCnnB KaGU9b6bՙ\J3)cHƙ4[) ص`-@ ^V trW[\QP *ŅI8>λ!\]DlOΉ,}Qո\֬O,CzE}bDoڑ]1 9&[sIP3n NmÎi{o݃.0usV|J+!ĒA _TU6iQvJ9s#SCb~N D[zEl"kLVƉ $Kĕ׹k{G tkbBre2mi !*yg(d=~^Eco(3囏VNsoO{߯yA+v>;~rW{}vv|b`~/5=Qo|l3ΣϾOG'?s5ޕBkݫv rO5O+;eG]#R}:׉M8"=m/?f*fASp9Ř +x~-|M;RG/l/:iuyVe {Q!z=(0v뿚S-'/60&FX"gX]"x/;f\&O~<ī}"Sk%.Rq[7[QL=1NHhbcD4,gt&:19 VvI([FN]hR}'\1)6謺FjmNCT犐QIgɥL\ 7@6I|BLZ+dBɢI˳ڽ qpwEI5HX"\⛒ UFD~n:ģ*R۱(jAeLn4dN^GKD~Q]Y#(zo.VO=n2U1تc^أ2c'/OaFW8pz.w0:/m˧dnzր47Ȩ+srxZ6'i#;,nB^_{()M~~[d_k;W~UbZXO1QO]Zn2 "Hmjqynư'Ah(Jep^y<= qAζK."W^Dϥ58`1{ {`';LSZn$ۄ f=Xzi{-pXDVUyUf9jNEcy0;E˂B퇌в]=VpTDօsAHYR5ݚ6Ty)Y}.4IM]ᎌ 꺨vጏ[x>/9C`5Vtb l|')ƶlahcp1Gt\AZh4wL]hwL](..!<%3W7Vsavy.>a.MGQrk~.ȹU:Ӥ&[:Ӫ0)ߖn8ȸٕpS5:1- JxB)n69#cI#OeMy6a690^ Pڍa˂ n^,-$ b-!fAGv]!E1;Ԛfd7#YA>qA7ٝS*UkKG3{j!`Lpq./DP/Nb~{U ӻ5Qsr_~wݵ 漮{^N_> ' >sQSW0pWƿ7Ha^@iS>a?QS∢"gFbF(ݤYMGiXUWq5v6˴zzj?ٷ~ZrgwNş/K;._[uN<dAk}+In0Di*ӹ|= *!13LRΰo;`0i(z7>My4'&ϧ_<5F*/>ڣ0RPz9>v6b6ghrX|G]a ؓp:D7.,"j"o%}Mq\+Wd)aHIejːf53НإzE~D{1RA|.\/:JIS_|(RZk fjHܨtA3 idL#fZϠK3sNfӐmx羅ڋ(wU+BSjZ+1ud>1gz?RPLaqo('ɺڔO.ah^ Xd;F!(:QcBDsd 3.gfF? 
ė̄K{;L9φeHvET0P·2גqMٶZwO Ǧ re܂న -W܌sPiXXp3t~ mԞS zcb:ܴ5 lby6ѠѴL{a>qXh0$(@k%3ء`GHsR9yP@P_Oѽ$ l#H aDz޽]õLu]~ͭUWssѤ"zeuy` wOnI(i]!h%]+Z 6f}ۭLї?F#@G|ɴs4û5ݶ ɺ-T/| hzox4{(w0"W<FDG"ѪލHxpl$<3*z%47߭Kw-o<%HwDG qؕ$w]Ԕ(l.1sk79`]{6{'-*U]S婺a¤p9p4/P^ !A$ũof|gGq۾\ * Y GJ.A8C-^@t;"RIM+0 >^@` fZ{b`oENpSsif{)HfD9}ۭcæ0,_:m/&"TOg G5N)6)l'k,| 1bԘ:ܻZ&D栁\cXkf`cD:GEƥ87^\(N'Dž|y / j AKg_R3F "0҂o&sR.̤K"K#Y-|;$K{w=@&rt+qY#e M.VMkf^0hLg翕^mkmkZoHjOZv7Dcl~nSIgi`ϔ!Ϡim<@.jkˉrC>C$BVlL bqJ:ޤ>L5 h(-$N\| Y|i[ vĵ%Kp3 82vI!ܐ K?}n/iա>Dr'!"ӿoԴ ﯭz})>`וsy֛WaWkӋ?}\%YxjVYNەIrH^g㜭j(.jψށ Q^;?":6`xϸ>ns?L?QYT?_9@pgj֒C 1w*`B+fgF>8:k܎fae[,GbȆmبb Ԇy׋jDZ[GbJws #o=%D A=YCa9&ܳx[w:}˜diacgӡn2؈w0v cKypK-ȥ =|cp2bLK?'~ˬA|0n0ȤUٜ~$*7o࢓)S( thc9A~MX4g}E[m&qz?MY6uy䝀kh7|MK]qW^ͫ^z jȣjto{pM?xξ6{ט$M#,>؈V`O;oS Ii# ) @Q,Ŏ[BY<&u-zW~5CCY!%i=HG{u1YGo[b\Lbӎ41;Ők9[m($ۖGHռi%pM#k \HٴsSfmZE?NK9wnC@1MBLfNhQ'@}K&r=A7֜dfa=Ovl걠*zA~DoļKaɚwµՄ&5!:MlsCmA4lPMx-k #eԩxD\Ϟ~Tv6b~ Q.~oW}vOQ曈3"Io&'񩶦'{!5Q\d/A{|^QKj(5'Q~,9LiSkH s`ؠ99u*Yhu')YDj=)D8J][q毘D~hbU%^5z IP W Ζ("`ZVgJ{#W(iM8֪Ium[ɼJG2h5jj:A}Rj4a)8}u"ORsL@!RG뒛*>Z2YSp޵uu1)h O3 d 5vҶ3n{̯_ܗruut/}Ju"H]jrȐR yϟw=*.#|/4מj^l 걉f2Prt!렚JJj2J3xȼ)9GcB7tM!@~=#7MAU)װI"tKow[y5'AfQB:7,ޞypTJ'8 _7!sMw]Y"J5Jc7.XzKMs3WՠÚ RWz 9O(ܝ~1qMД D%o7Q{/9|>hBO^yp0'+s]|x\dn8bA/x$ŗ%Qz0&1a`26 ͦPg &1I3D`qýgJ)Rcsgiڱxӎ֊teѭEm?vu |;!8oR1P|3g 't|vkߌxx@y'~';J3pȱ'19"`f'bXT !>@hS@N"| V#\K#1Oe7F}(B҇Q4/Ȱ-Qu/UvSŔ#Qi|i\φ)x(ۿh-C#$1GEh?J}vr\5hHvbr9 X8ɹ\*kJI@ͮxElBi΂S .F@^N [Mǖ>ȼ>Qx1xKnhyrH7KOݚܷ H56 98Uo f`Lu{7ZrQ R[iIi$,TjUu}%Myɏ=ͶHZk覬e]y˘(mo2/w@{x9|.q9oj (gv l-qrQ);"=el7{4o񬟵岳>jշ;Yo߼e/ 7/OFPz]g pٱtTr z; Є,1RN^̛L ϟ׍]GoDgw5J"@Q2Gs, F0J0Ĕg\ Zi}H^HZ,j5뛩Hôy0gvzZ3Eq8jBfd>: *籨RjdAuf\NELb[ v)pМoVv^džQ7 jjZ$gar2S=. 
e0h ɶ#+Yo:¨MSGdQsU2I@@5e:>U6Tboא,@d/C4y;mz}^|n᫪ ddӦ׷]^`>ycJ\yS{7'!@w]|p]NN}0QXt7\f;1horC0 }[L<3h#mEzݮ{J0nG<]Nuů^g1њ]M %Eu hăb UDeqQةNJk"P7[P)]'e2lܫEX))rc­zF*bRaM{I)r5E7hz;=^{)  ."=z9e|-Qp&OFikF#jGZfWY H\MjɠoqJ&;))#f( v7CMNq;dn^3g[A|oh0kS4O>:ɆcŸOLLjrЍ!(f uưՑJ2@;JJQ="-{Ŧ髪j뒍ўGQ)i% rMuw\=mh!M%qȳ!e\}(MEfpvFlBw:tY$:ROg0%`nWD1hvїU0xj>XgTusslq\t[쥵X{5ΰRD: EDKUqzvb@kAGɫ[BDh]D`+]UVG9d!4̀f:⬜S!,*Xf8p$V,+X*Ѭtlkn4A, iI7qLid/?1ɪ/PeGs[܉<7}.ͤ'"Q6]N.{׉;-yq]e ԯէS%x*cxC<>f Ïeqͣoq[)Tٚ~}.'Fp=1yGeWgFU<*aaYvQ}ɡ p_;ka*f1&#{V]CMz+H!5SS"'v^j5UcUChQ_%~yW2h~tbٲ̗.*g`: _>6aVx q: 6m.W|f׆prs{)8\̈[^ULgMYyb"f߆m]>Hr?_w(a>=(zo ^=<ۊzc?m>ͅ۳3[}kwhb;..lst=etTCM[{^Jr=f)2[[N)g%$_S9SgvV3W;՘/ǭ&q!!`[{R >* >(x{~j<OsWcc33^hHb{kݕ$o!Dic3h OdÅSJ %l) L*1&7:d8m6\7s!C{O<5(psɟL1 ﰼ7:?{^K7oWh=0);jТ9>@S9@3{pg!J?Mc:7>t^TC 87طӃ2u=nSÇ^r|0M$C%ys'l}@y4TvH[u̞]$C.|1:l!8{!P:A}.썟G9՘/&~O={/́[$3jr*R[%O3~fӨ_SMn. fxmgsndS+m?{Ǎs΂6Y,I9@$ ^-[KίHfzF3=j bW*%:*ޞzʫ k5 tk |2`:5\~gfW0m3N66qh~8ar@4Cۊv;y~Wi6":=:g8ҌQ1=b ZspLZ\*2XgNgPRiwdE6SѶ05j\ S-ZEjMiG`Of 3H՜7yPl#FH,t9yS]3]Ψb*x_G{Nrp~s7yƁ+&o5º1Nzr16sJ<)eۛMݰ>Yu=\xx-8{WU첰zÙ3wUݚtpGK*%+u-;t"Q&#–=|&#Gp-V ®Fs"h${.-Y6YgzfUTA,TX]SK:= 9w٘\'$_X۞jk| k~#X9kI{J] mMQZp:b3t[l|2;>Rhzr<\u](ڐfc`gGBeZ^$#ŔMQwҫ` BJ>|̪VF9YocF s@ 4ih왘ᬯ,= w|Zkc |kx+[jh3\pjD[>Q`&C~L]8Xђ%{% p%CUਦo(d!2F .x#SfB&H[-{,?NsR$#]2>l!ɚ0Nfv+D`B0۹4ʕBD9X:v&cALlball+lŔ>)L֠dG]'!e$ ,+()C#zC"nY#|9AL1 ?.c!Ԥ,R$@V?l߬JI&*lB *\6B6kg*L٤j 29IbD=I7>؅?\i}6Hl 9h^j^&=Yr۶q,vN7*`p9FMnY{[K{^b#{;/ 1lqz7/[PY^\O JۢudNĬa-ȆP_b:ո |@PєEX,;#!`kɬmjTpbV\?@hXGxYluDMˢeؑ'WB,NAziRdgî/6 {׮+% dFVUN*A~yҰQ۵'hx>YhZ"~65 oo`"5'|LNڱy|CyVɨ}*iEwgœ}piAyVka;bV™b6T$ōb`ҘȾ 6)6\#NNgb 1!y ے 3By6mm$un#`\ê3XrƃCMI :xpL'Ct>q͠kޞڭ9%ak`~p=li'D#. 
[nO'X!LGMЇmZY]s0FSrK%${#JSTzC)Tֺ*N޶9-Q9,x.l.DQ3AT]gQ90fuHZ r9kUbP&\\P :m-㴄cFXDd?1Ot5/+U y.NKUjʹ %XseYR!YU)v͇ #h_-.RyWkr~wmc2{a-[Tn|1-yMaMhLJjS WvVܓo{[xhdFԷj1Lr'd>)S"N{Nm?|إX+ɒl`-57] O0OG_3 h".eΔ%X @Cћ`d6:śKsUh(SS^(ݙ25~wH\+>gIa{oJ,,:0J"%Vl&*T:6;l*[D&AYEX1Ϫuvl,.JVF&Z~QRN yF"?RW#M予ݒEBmjo#2ߺfCqќ=T-q`HsJ~c$E!;@h))j*zOаL Hd (˙sØt#lK$u}QABlR ֬!b96AvTEgLuv1OMN0ʊLlBXރE6ʨ/!T`ijٓTLM`Ja3 R2&z M>yJ'k{,.;kS3fR]8=NGk^ݍx՝qYǫ6WdGLiL7g.3Y|MvƚyZI{p. wlI}%h6g/Scz$o% فrj࠵\u8 6Iz2%o֘)id7~NA)oStWN;iZi;BU1]M$--x}$|jʉ5BŪ\e+P:mhLaqZk|`5H\OEB0nqH"<=JXo޼ќ ?-uE:"G=Ƕ#Eǻ)^zokՎ~ԍFD|^0U͆mcx+3#qv^5kU_zܮE*]߮N C_5<:<ˢIኵL{q9͛ӎxaя۟N&͗7v tr1r:[[Q ˽5ĠJ( : YU nP<1lO;K0Ent١:SC$hi6ZvStWyI=V>"s ueZi 3. Uz!%kzŠ}~ޮ(,8)4ɧ޵WCg"A2CRwFUO>ٛGd؏kuSǏ oV/#]"jGk#Hj^|FcF:g[9%)h Eoo-[2G{FX7X%Mm%#KUHN68Tt\~$yz(N*Jns05d sTjCSteQ}-V!޻έYs0@Z.yz A )LˎVs Ik+wvݭ9)R5}<3n)SR!R%h͛?^&(;3]ź螔rYWd5-M3hta/3],MAo$ dZ~ >A5/04N1)ڧ,[-i1W&@s𻽼EN_n Zڦ&V?5džn>?̀y|ƐI/AQ5;KOd {" y(o1y)y_L[;ӄ*e5<4aAJb$h8Pl!̳wH֐`,0`2]>;}i%)nlg(G>ޣU :KWe'(P.QlK[2e1NKKKf9ci'9\d @RB8+]D? R\ﮜ3*b ьf(0|/kRŲ&o&!s{`A2mLcrbr=f8B袳*괒L1y?cѬ4+Z79(#3{׼ f,Kd63toge d{gKEj fC{fqMhqo=J K6qrpBKUgR7aMZ Uf3=(nKl(uK kʥZz7B[fwOe̠J%N!kD6XEkHmH$EK9DB rM5XY FiBTYOs%:CľkK m$QK]R pJ.6XՆ XuIO%Fu*[to.)CD2xRKW b%m]%M7-nH8X1?ZO{^J8|**?'K#m=S}I yb۫|dG{ bq7*pmQf*Bè{踈V7{l2c]M:`t{[{E:G+ڌs()_U%{LPE$jq뜷Rf %;*Dvchq?c4>J>]\ s>TD>(=.kS)s{=ɵi'f(-4=4M~TOƕs-S`P%Vhxz'l%+荬TyҸaťmc:{W"NX8 O-"R^8߆b6HUSIbPcQ o鶖n{)tJqO9btv}?iw9Yh;=)<%>iDy/Fkؗ`nDmp$e'5 eWvb>BC%  ep$thӏ]BV/YN]agz!iHx_5K`WyzwAϘTђ fm-9_8UixkJF44Z 8kʜlVyJfk0yIrA,Ncd"Ń2FV2lݗ9Nqvί=+m" \@цAWL]rrɔ%NZ5[ 7 H!wSyگvWxΌK?4`s 0zO7j,v H9KYcs=D_x&VSĻTpHB*u"5$, [Xm{Ca?;?;SL1ϔ6=9/Pj81└qi;cPVH6k++[ߛ<?KlC{WIJIP|) kjyϋ7o;j%%vbR+qI.Iէt1JXa#pi0RkR9CjJଘtǔYZc1*'U)*8;3S4cRS❄$3P d0t[֤v۔5aʚZeM6^2kmYMS } LIëmL]}ec?X3qWE'+l-}],LM1J*M=$E:A8V닅΁TFw^0 8`_,B\\pKo|r^J=ʲcF.m&#(?5S! 
BNC,r):BI$bH|>(P;_lzzF&>zgǘD492{3c1s"Z{4yչե&#BEC79zgX{ono盶Az"WqݿwG?^\6<5p&?`6[;9.qSE2N “xn+7{Zi&y ^_IP |I"W* #xr0߾|@=` Mya1Qz ļY=V> (E1۷ZPc4X5P(^~|SF(Mܵ>fwձRB99*I'ExzM3 k6yN6?#|G* kx_&piIp|ZֿNki{0$,^Dd'ƁOf\f>{UwvuXmXBsuHW}|Wz;rEUcs>b|q=XI^5֫<\dū?K?~4䊗;7$]\~%Itʮ%XڊU{[qNe՗^ڱ:BģԇV?\zw'V^0:Hő?uk!&_^}&D^mpFa/QnMMLy1@k7h@k U$JdB[KB =ƿ7-E%W!7T*aУU5EM]#V'lj.az$|yTx%0}9-vl/_C@N{ $#ѳ}FKRmmԋ?fA>hk],mwWƲQp$*XcbWjfCu ZIhVc@Dy6(CQX($͎mH.w_w-"T?r{g2 Y\wa.oYHP % 6b!'CJ6cl#a] m5JmS ׫kUD4gV ;!ѻ ZlJ{w i|"TKO<3^ςHM&Hjې:^2-Bet*\D Oʗvtś2zқ\ڽ9jϼG!謭T,"mn("E`İӜ.q7}K69S\ eJaq*UgW<8 uFs2ut%Lc LJ.Y\*80c)健jsϼ4* }1b Sì}[(/<>_ u Dλ߲fUrc_Zj%Jy^NG_rR# :!PU $ʹc5iIMLK1yOc}@j+ؾg,quHOMhhNߦ1.^NnF{ ݷotAi: . +FCDQop.OK&uF 0'wyU܋.JCyWg`G&3G{&}_ph m+$Jݝalߣ@-Ro+?s34]G,lf7!ʵj2gJحg- }.8bzѝ`|WFC9:4Ai&x#uhDb$6QR:16CP N%41G2jn4ڸ-Z.v HG+E_ RCg¢x.yf4njR-;'blFUpݨ$zrQʁRGMuQ uq5;`4o¶q;c:ZFv?`V*60 u=3`i$a $.ւ l< R`(Ɛp]` MA71*>h0L/Z-В>σܺ|պHs͸,>Ëޣ%s|Yԇa>~dOݲϗۿyShLQO.ΏƇ72--e\έۛI1q88rAG#Kd& la~%Q&v!'utJm{VgՒR^/ڭϝ*7UP1OTsQhabLa ~0VU/J^1n4VdK1y:#)fG7\,U;@skxj"K3z iFj:y3O4E-T(S)s$XcN˷iXz~o.\Oum}ㄭhLY/#lQLMJwmmzYA6H' 5%2$9?CJ IY []ꮪǔɃ(sFd5ڙMSv1!\E3tm]4KJeXc)0-9(ET{[Q!T-;*t^Oڋ2Mw0TwQj/>OËik>Ӟ>d?Ln^yvReS% *雫+. 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": dial tcp 192.168.126.11:10357: connect: connection refused Feb 26 08:45:03 crc kubenswrapper[4925]: body: Feb 26 08:45:03 crc kubenswrapper[4925]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:30.816543339 +0000 UTC m=+2.956117622,LastTimestamp:2026-02-26 08:44:30.816543339 +0000 UTC m=+2.956117622,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 08:45:03 crc kubenswrapper[4925]: > Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.474824 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897bf711619d90c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] []
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": dial tcp 192.168.126.11:10357: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:30.8165819 +0000 UTC m=+2.956156183,LastTimestamp:2026-02-26 08:44:30.8165819 +0000 UTC m=+2.956156183,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.478875 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897bf711f19c3ea openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:30.967571434 +0000 UTC m=+3.107145717,LastTimestamp:2026-02-26 08:44:30.967571434 +0000 UTC m=+3.107145717,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.483027 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf711f3383ca openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:30.969258954 +0000 UTC m=+3.108833237,LastTimestamp:2026-02-26 08:44:30.969258954 +0000 UTC m=+3.108833237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.486943 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897bf71202c1b77 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:30.985550711 +0000 UTC m=+3.125124994,LastTimestamp:2026-02-26 08:44:30.985550711 +0000 UTC m=+3.125124994,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.490436 4925 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897bf71203ce151 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:30.986649937 +0000 UTC m=+3.126224240,LastTimestamp:2026-02-26 08:44:30.986649937 +0000 UTC m=+3.126224240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.494792 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf71204208af openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:30.986987695 +0000 UTC m=+3.126561978,LastTimestamp:2026-02-26 08:44:30.986987695 +0000 UTC 
m=+3.126561978,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.500191 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf71204d640e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:30.987731982 +0000 UTC m=+3.127306255,LastTimestamp:2026-02-26 08:44:30.987731982 +0000 UTC m=+3.127306255,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.503923 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897bf712bc624c2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.18019501 +0000 UTC m=+3.319769283,LastTimestamp:2026-02-26 08:44:31.18019501 +0000 UTC m=+3.319769283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.509076 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf712be52b04 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.182228228 +0000 UTC m=+3.321802521,LastTimestamp:2026-02-26 08:44:31.182228228 +0000 UTC m=+3.321802521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.514024 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897bf712d223812 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.203006482 +0000 UTC m=+3.342580795,LastTimestamp:2026-02-26 08:44:31.203006482 +0000 UTC m=+3.342580795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.522108 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf712d581c91 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.206538385 +0000 UTC m=+3.346112668,LastTimestamp:2026-02-26 08:44:31.206538385 +0000 UTC m=+3.346112668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.529547 4925 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf712d6a91ea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.207748074 +0000 UTC m=+3.347322357,LastTimestamp:2026-02-26 08:44:31.207748074 +0000 UTC m=+3.347322357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.536874 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf7136d538e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.365732584 +0000 UTC m=+3.505306867,LastTimestamp:2026-02-26 08:44:31.365732584 +0000 UTC m=+3.505306867,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.543195 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf71377fda73 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.376915059 +0000 UTC m=+3.516489342,LastTimestamp:2026-02-26 08:44:31.376915059 +0000 UTC m=+3.516489342,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.548860 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf71378f32e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.377920743 +0000 UTC m=+3.517495026,LastTimestamp:2026-02-26 08:44:31.377920743 +0000 UTC m=+3.517495026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.561714 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf7140e85311 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.534756625 +0000 UTC m=+3.674330898,LastTimestamp:2026-02-26 08:44:31.534756625 +0000 UTC m=+3.674330898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.568355 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf7141b92108 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.54844084 +0000 UTC m=+3.688015123,LastTimestamp:2026-02-26 08:44:31.54844084 +0000 UTC m=+3.688015123,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.573130 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf7141e9e320 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.551636256 +0000 UTC m=+3.691210539,LastTimestamp:2026-02-26 08:44:31.551636256 +0000 UTC m=+3.691210539,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.578703 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf714c3ab2c5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.724704453 +0000 UTC m=+3.864278736,LastTimestamp:2026-02-26 08:44:31.724704453 +0000 UTC m=+3.864278736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.582222 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf714d3c1278 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.741571704 +0000 UTC m=+3.881145987,LastTimestamp:2026-02-26 08:44:31.741571704 +0000 UTC m=+3.881145987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.583058 4925 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the 
namespace \"kube-node-lease\"" interval="7s" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.584257 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf717e2dfdb9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:32.562732473 +0000 UTC m=+4.702306796,LastTimestamp:2026-02-26 08:44:32.562732473 +0000 UTC m=+4.702306796,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: I0226 08:45:03.585726 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 08:45:03 crc kubenswrapper[4925]: I0226 08:45:03.587228 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:03 crc kubenswrapper[4925]: I0226 08:45:03.587259 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:03 crc kubenswrapper[4925]: I0226 08:45:03.587271 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:03 crc kubenswrapper[4925]: I0226 08:45:03.587300 4925 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 
08:45:03.587346 4925 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897bf71378f32e7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf71378f32e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.377920743 +0000 UTC m=+3.517495026,LastTimestamp:2026-02-26 08:44:32.567274251 +0000 UTC m=+4.706848534,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.590781 4925 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.590872 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf718acd4052 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:32.774496338 +0000 UTC m=+4.914070641,LastTimestamp:2026-02-26 08:44:32.774496338 +0000 UTC m=+4.914070641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.591968 4925 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897bf7140e85311\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf7140e85311 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.534756625 +0000 UTC m=+3.674330898,LastTimestamp:2026-02-26 08:44:32.780125741 +0000 UTC m=+4.919700024,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.596009 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf718b50357d openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:32.783078781 +0000 UTC m=+4.922653064,LastTimestamp:2026-02-26 08:44:32.783078781 +0000 UTC m=+4.922653064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.600596 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf718b5d9034 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:32.783953972 +0000 UTC m=+4.923528265,LastTimestamp:2026-02-26 08:44:32.783953972 +0000 UTC m=+4.923528265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.606051 4925 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897bf7141b92108\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf7141b92108 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:31.54844084 +0000 UTC m=+3.688015123,LastTimestamp:2026-02-26 08:44:32.79481912 +0000 UTC m=+4.934393423,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.611620 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf71954992d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:32.950416083 +0000 UTC m=+5.089990366,LastTimestamp:2026-02-26 08:44:32.950416083 +0000 UTC m=+5.089990366,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.616928 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf719633a061 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:32.965754977 +0000 UTC m=+5.105329260,LastTimestamp:2026-02-26 08:44:32.965754977 +0000 UTC m=+5.105329260,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.623814 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf7196498eed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:32.967192301 +0000 UTC m=+5.106766594,LastTimestamp:2026-02-26 08:44:32.967192301 +0000 UTC m=+5.106766594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.628465 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf719fb7d430 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:33.125413936 +0000 UTC m=+5.264988219,LastTimestamp:2026-02-26 08:44:33.125413936 +0000 UTC m=+5.264988219,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.632497 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf71a0508a98 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:33.135422104 +0000 UTC m=+5.274996387,LastTimestamp:2026-02-26 08:44:33.135422104 +0000 UTC m=+5.274996387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.637465 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.1897bf71a05de621 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:33.136297505 +0000 UTC m=+5.275871788,LastTimestamp:2026-02-26 08:44:33.136297505 +0000 UTC m=+5.275871788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.643548 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf71ab6edb43 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:33.321958211 +0000 UTC m=+5.461532504,LastTimestamp:2026-02-26 08:44:33.321958211 +0000 UTC m=+5.461532504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.648608 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf71ac895548 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:33.3404706 +0000 UTC m=+5.480044893,LastTimestamp:2026-02-26 08:44:33.3404706 +0000 UTC m=+5.480044893,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.655468 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf71ac9aa601 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:33.341605377 +0000 UTC m=+5.481179700,LastTimestamp:2026-02-26 08:44:33.341605377 +0000 UTC m=+5.481179700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.661290 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf71b8dacddb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:33.547136475 +0000 UTC m=+5.686710748,LastTimestamp:2026-02-26 08:44:33.547136475 +0000 UTC m=+5.686710748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.668839 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897bf71b9fbb8e7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:33.566071015 +0000 UTC m=+5.705645338,LastTimestamp:2026-02-26 08:44:33.566071015 +0000 UTC m=+5.705645338,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.677246 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event=< Feb 26 08:45:03 crc kubenswrapper[4925]: &Event{ObjectMeta:{kube-apiserver-crc.1897bf73cffcda3d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Feb 26 08:45:03 crc kubenswrapper[4925]: body: Feb 26 08:45:03 crc kubenswrapper[4925]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:42.525178429 +0000 UTC m=+14.664752712,LastTimestamp:2026-02-26 08:44:42.525178429 +0000 UTC m=+14.664752712,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 08:45:03 crc kubenswrapper[4925]: > Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.681769 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf73cffd575c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:42.52521046 +0000 UTC m=+14.664784743,LastTimestamp:2026-02-26 
08:44:42.52521046 +0000 UTC m=+14.664784743,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.685751 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 26 08:45:03 crc kubenswrapper[4925]: &Event{ObjectMeta:{kube-apiserver-crc.1897bf73f696dcbe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 26 08:45:03 crc kubenswrapper[4925]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 08:45:03 crc kubenswrapper[4925]: Feb 26 08:45:03 crc kubenswrapper[4925]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:43.172805822 +0000 UTC m=+15.312380185,LastTimestamp:2026-02-26 08:44:43.172805822 +0000 UTC m=+15.312380185,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 08:45:03 crc kubenswrapper[4925]: > Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.690938 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897bf73f69b0183 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:43.173077379 +0000 UTC m=+15.312651712,LastTimestamp:2026-02-26 08:44:43.173077379 +0000 UTC m=+15.312651712,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.696082 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 26 08:45:03 crc kubenswrapper[4925]: &Event{ObjectMeta:{kube-controller-manager-crc.1897bf741cf850a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 26 08:45:03 crc kubenswrapper[4925]: body:
Feb 26 08:45:03 crc kubenswrapper[4925]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:43.816726692 +0000 UTC m=+15.956300975,LastTimestamp:2026-02-26 08:44:43.816726692 +0000 UTC m=+15.956300975,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 26 08:45:03 crc kubenswrapper[4925]: >
Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.700316 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897bf741cf8fa7e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:43.816770174 +0000 UTC m=+15.956344447,LastTimestamp:2026-02-26 08:44:43.816770174 +0000 UTC m=+15.956344447,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.706363 4925 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897bf741cf850a4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 26 08:45:03 crc kubenswrapper[4925]: &Event{ObjectMeta:{kube-controller-manager-crc.1897bf741cf850a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 26 08:45:03 crc kubenswrapper[4925]: body:
Feb 26 08:45:03 crc kubenswrapper[4925]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:43.816726692 +0000 UTC m=+15.956300975,LastTimestamp:2026-02-26 08:44:53.817459406 +0000 UTC m=+25.957033729,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 26 08:45:03 crc kubenswrapper[4925]: >
Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.712279 4925 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897bf741cf8fa7e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897bf741cf8fa7e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:43.816770174 +0000 UTC m=+15.956344447,LastTimestamp:2026-02-26 08:44:53.817560268 +0000 UTC m=+25.957134591,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.718163 4925 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897bf76713c042a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:53.820384298 +0000 UTC m=+25.959958641,LastTimestamp:2026-02-26 08:44:53.820384298 +0000 UTC m=+25.959958641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.723003 4925 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897bf70cccb882e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897bf70cccb882e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:29.586712622 +0000 UTC m=+1.726286895,LastTimestamp:2026-02-26 08:44:53.946102786 +0000 UTC m=+26.085677069,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.728700 4925 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897bf70dfe6c66f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897bf70dfe6c66f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:29.907265135 +0000 UTC m=+2.046839458,LastTimestamp:2026-02-26 08:44:54.161008056 +0000 UTC m=+26.300582349,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.733654 4925 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897bf70e068938b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897bf70e068938b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:29.915771787 +0000 UTC m=+2.055346090,LastTimestamp:2026-02-26 08:44:54.17087554 +0000 UTC m=+26.310449863,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 08:45:03 crc kubenswrapper[4925]: I0226 08:45:03.817408 4925 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 26 08:45:03 crc kubenswrapper[4925]: I0226 08:45:03.817552 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.824768 4925 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897bf741cf850a4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 26 08:45:03 crc kubenswrapper[4925]: &Event{ObjectMeta:{kube-controller-manager-crc.1897bf741cf850a4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 26 08:45:03 crc kubenswrapper[4925]: body:
Feb 26 08:45:03 crc kubenswrapper[4925]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:43.816726692 +0000 UTC m=+15.956300975,LastTimestamp:2026-02-26 08:45:03.817495373 +0000 UTC m=+35.957069656,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 26 08:45:03 crc kubenswrapper[4925]: >
Feb 26 08:45:03 crc kubenswrapper[4925]: E0226 08:45:03.828862 4925 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897bf741cf8fa7e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897bf741cf8fa7e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:44:43.816770174 +0000 UTC m=+15.956344447,LastTimestamp:2026-02-26 08:45:03.817675477 +0000 UTC m=+35.957249760,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 08:45:03 crc kubenswrapper[4925]: I0226 08:45:03.837676 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 08:45:03 crc kubenswrapper[4925]: I0226 08:45:03.837957 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 08:45:03 crc kubenswrapper[4925]: I0226 08:45:03.839496 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:03 crc kubenswrapper[4925]: I0226 08:45:03.839600 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:03 crc kubenswrapper[4925]: I0226 08:45:03.839617 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:04 crc kubenswrapper[4925]: I0226 08:45:04.458305 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:04 crc kubenswrapper[4925]: I0226 08:45:04.507127 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 08:45:04 crc kubenswrapper[4925]: I0226 08:45:04.508452 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:04 crc kubenswrapper[4925]: I0226 08:45:04.508506 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:04 crc kubenswrapper[4925]: I0226 08:45:04.508518 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:04 crc kubenswrapper[4925]: I0226 08:45:04.509122 4925 scope.go:117] "RemoveContainer" containerID="e2783705ffea7f84f1c417d753683ab8cd68e99fa30f3bd05391185fd27a39c2"
Feb 26 08:45:05 crc kubenswrapper[4925]: I0226 08:45:05.457019 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:05 crc kubenswrapper[4925]: I0226 08:45:05.672896 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 26 08:45:05 crc kubenswrapper[4925]: I0226 08:45:05.673747 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 26 08:45:05 crc kubenswrapper[4925]: I0226 08:45:05.676651 4925 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8d6d215523be6e52c9479f608fe2baa4f683407f8fd62dbd634a22e692a38e34" exitCode=255
Feb 26 08:45:05 crc kubenswrapper[4925]: I0226 08:45:05.676719 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8d6d215523be6e52c9479f608fe2baa4f683407f8fd62dbd634a22e692a38e34"}
Feb 26 08:45:05 crc kubenswrapper[4925]: I0226 08:45:05.676819 4925 scope.go:117] "RemoveContainer" containerID="e2783705ffea7f84f1c417d753683ab8cd68e99fa30f3bd05391185fd27a39c2"
Feb 26 08:45:05 crc kubenswrapper[4925]: I0226 08:45:05.676926 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 08:45:05 crc kubenswrapper[4925]: I0226 08:45:05.678184 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:05 crc kubenswrapper[4925]: I0226 08:45:05.678238 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:05 crc kubenswrapper[4925]: I0226 08:45:05.678256 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:05 crc kubenswrapper[4925]: I0226 08:45:05.678986 4925 scope.go:117] "RemoveContainer" containerID="8d6d215523be6e52c9479f608fe2baa4f683407f8fd62dbd634a22e692a38e34"
Feb 26 08:45:05 crc kubenswrapper[4925]: E0226 08:45:05.679293 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 26 08:45:06 crc kubenswrapper[4925]: W0226 08:45:06.335320 4925 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:06 crc kubenswrapper[4925]: E0226 08:45:06.335389 4925 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Feb 26 08:45:06 crc kubenswrapper[4925]: W0226 08:45:06.418127 4925 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Feb 26 08:45:06 crc kubenswrapper[4925]: E0226 08:45:06.418199 4925 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Feb 26 08:45:06 crc kubenswrapper[4925]: I0226 08:45:06.453497 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:06 crc kubenswrapper[4925]: I0226 08:45:06.682499 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 26 08:45:07 crc kubenswrapper[4925]: I0226 08:45:07.459459 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:08 crc kubenswrapper[4925]: I0226 08:45:08.330990 4925 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 26 08:45:08 crc kubenswrapper[4925]: I0226 08:45:08.346055 4925 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 26 08:45:08 crc kubenswrapper[4925]: I0226 08:45:08.458823 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:08 crc kubenswrapper[4925]: E0226 08:45:08.603406 4925 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 26 08:45:08 crc kubenswrapper[4925]: I0226 08:45:08.659562 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 08:45:08 crc kubenswrapper[4925]: I0226 08:45:08.659847 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 08:45:08 crc kubenswrapper[4925]: I0226 08:45:08.661454 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:08 crc kubenswrapper[4925]: I0226 08:45:08.661591 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:08 crc kubenswrapper[4925]: I0226 08:45:08.661630 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:08 crc kubenswrapper[4925]: I0226 08:45:08.662585 4925 scope.go:117] "RemoveContainer" containerID="8d6d215523be6e52c9479f608fe2baa4f683407f8fd62dbd634a22e692a38e34"
Feb 26 08:45:08 crc kubenswrapper[4925]: E0226 08:45:08.662929 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 26 08:45:09 crc kubenswrapper[4925]: I0226 08:45:09.460417 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:10 crc kubenswrapper[4925]: I0226 08:45:10.457995 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:10 crc kubenswrapper[4925]: I0226 08:45:10.591027 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 08:45:10 crc kubenswrapper[4925]: E0226 08:45:10.591312 4925 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 26 08:45:10 crc kubenswrapper[4925]: I0226 08:45:10.592627 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:10 crc kubenswrapper[4925]: I0226 08:45:10.592684 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:10 crc kubenswrapper[4925]: I0226 08:45:10.592705 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:10 crc kubenswrapper[4925]: I0226 08:45:10.592754 4925 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 08:45:10 crc kubenswrapper[4925]: E0226 08:45:10.599699 4925 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 26 08:45:11 crc kubenswrapper[4925]: I0226 08:45:11.458113 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:11 crc kubenswrapper[4925]: I0226 08:45:11.617371 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 08:45:11 crc kubenswrapper[4925]: I0226 08:45:11.617611 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 08:45:11 crc kubenswrapper[4925]: I0226 08:45:11.619191 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:11 crc kubenswrapper[4925]: I0226 08:45:11.619262 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:11 crc kubenswrapper[4925]: I0226 08:45:11.619287 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:11 crc kubenswrapper[4925]: I0226 08:45:11.624380 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 08:45:11 crc kubenswrapper[4925]: I0226 08:45:11.699516 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 08:45:11 crc kubenswrapper[4925]: I0226 08:45:11.701099 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:11 crc kubenswrapper[4925]: I0226 08:45:11.701160 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:11 crc kubenswrapper[4925]: I0226 08:45:11.701179 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:12 crc kubenswrapper[4925]: I0226 08:45:12.459815 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:12 crc kubenswrapper[4925]: I0226 08:45:12.523967 4925 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 08:45:12 crc kubenswrapper[4925]: I0226 08:45:12.524254 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 08:45:12 crc kubenswrapper[4925]: I0226 08:45:12.525891 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:12 crc kubenswrapper[4925]: I0226 08:45:12.525924 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:12 crc kubenswrapper[4925]: I0226 08:45:12.525932 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:12 crc kubenswrapper[4925]: I0226 08:45:12.526438 4925 scope.go:117] "RemoveContainer" containerID="8d6d215523be6e52c9479f608fe2baa4f683407f8fd62dbd634a22e692a38e34"
Feb 26 08:45:12 crc kubenswrapper[4925]: E0226 08:45:12.526620 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 26 08:45:13 crc kubenswrapper[4925]: I0226 08:45:13.457801 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:13 crc kubenswrapper[4925]: W0226 08:45:13.741098 4925 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Feb 26 08:45:13 crc kubenswrapper[4925]: E0226 08:45:13.741182 4925 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Feb 26 08:45:14 crc kubenswrapper[4925]: I0226 08:45:14.461000 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:15 crc kubenswrapper[4925]: W0226 08:45:15.235394 4925 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Feb 26 08:45:15 crc kubenswrapper[4925]: E0226 08:45:15.235481 4925 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Feb 26 08:45:15 crc kubenswrapper[4925]: I0226 08:45:15.463002 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:16 crc kubenswrapper[4925]: I0226 08:45:16.457300 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:17 crc kubenswrapper[4925]: I0226 08:45:17.458780 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:17 crc kubenswrapper[4925]: E0226 08:45:17.599316 4925 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 26 08:45:17 crc kubenswrapper[4925]: I0226 08:45:17.600198 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 08:45:17 crc kubenswrapper[4925]: I0226 08:45:17.601727 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:17 crc kubenswrapper[4925]: I0226 08:45:17.601773 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:17 crc kubenswrapper[4925]: I0226 08:45:17.601786 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:17 crc kubenswrapper[4925]: I0226 08:45:17.601824 4925 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 08:45:17 crc kubenswrapper[4925]: E0226 08:45:17.606880 4925 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 26 08:45:18 crc kubenswrapper[4925]: I0226 08:45:18.457605 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:18 crc kubenswrapper[4925]: E0226 08:45:18.603752 4925 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 26 08:45:19 crc kubenswrapper[4925]: I0226 08:45:19.458947 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:20 crc kubenswrapper[4925]: I0226 08:45:20.459078 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:20 crc kubenswrapper[4925]: I0226 08:45:20.957839 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 08:45:20 crc kubenswrapper[4925]: I0226 08:45:20.958027 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 08:45:20 crc kubenswrapper[4925]: I0226 08:45:20.959473 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:20 crc kubenswrapper[4925]: I0226 08:45:20.959552 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:20 crc kubenswrapper[4925]: I0226 08:45:20.959573 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:21 crc kubenswrapper[4925]: I0226 08:45:21.456285 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:22 crc kubenswrapper[4925]: I0226 08:45:22.460498 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:23 crc kubenswrapper[4925]: I0226 08:45:23.459713 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:24 crc kubenswrapper[4925]: I0226 08:45:24.458963 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 26 08:45:24 crc kubenswrapper[4925]: I0226 08:45:24.506266 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 08:45:24 crc kubenswrapper[4925]: I0226 08:45:24.508964 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:24 crc kubenswrapper[4925]: I0226 08:45:24.509180 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:24 crc kubenswrapper[4925]: I0226 08:45:24.509318 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:24 crc kubenswrapper[4925]: I0226 08:45:24.510512 4925 scope.go:117] "RemoveContainer" containerID="8d6d215523be6e52c9479f608fe2baa4f683407f8fd62dbd634a22e692a38e34"
Feb 26 08:45:24 crc kubenswrapper[4925]: E0226 08:45:24.511078 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 26 08:45:24 crc kubenswrapper[4925]: I0226 08:45:24.607924 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 08:45:24 crc kubenswrapper[4925]: E0226 08:45:24.608679 4925 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 26 08:45:24 crc kubenswrapper[4925]: I0226 08:45:24.610943 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:24 crc kubenswrapper[4925]: I0226 08:45:24.611012 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:24 crc kubenswrapper[4925]: I0226 08:45:24.611032 4925 kubelet_node_status.go:724] "Recording event message
for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:24 crc kubenswrapper[4925]: I0226 08:45:24.611073 4925 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 08:45:24 crc kubenswrapper[4925]: E0226 08:45:24.620007 4925 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 08:45:25 crc kubenswrapper[4925]: I0226 08:45:25.459510 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 08:45:26 crc kubenswrapper[4925]: I0226 08:45:26.456719 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 08:45:27 crc kubenswrapper[4925]: I0226 08:45:27.458626 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 08:45:28 crc kubenswrapper[4925]: I0226 08:45:28.460036 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 08:45:28 crc kubenswrapper[4925]: E0226 08:45:28.605062 4925 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 08:45:29 crc kubenswrapper[4925]: I0226 08:45:29.459810 4925 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 08:45:30 crc kubenswrapper[4925]: I0226 08:45:30.462175 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 08:45:31 crc kubenswrapper[4925]: I0226 08:45:31.456454 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 08:45:31 crc kubenswrapper[4925]: E0226 08:45:31.615924 4925 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 08:45:31 crc kubenswrapper[4925]: I0226 08:45:31.623151 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 08:45:31 crc kubenswrapper[4925]: I0226 08:45:31.624539 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:31 crc kubenswrapper[4925]: I0226 08:45:31.624581 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:31 crc kubenswrapper[4925]: I0226 08:45:31.624592 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:31 crc kubenswrapper[4925]: I0226 08:45:31.624616 4925 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 08:45:31 crc kubenswrapper[4925]: E0226 08:45:31.625897 4925 
kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 08:45:32 crc kubenswrapper[4925]: I0226 08:45:32.458368 4925 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 08:45:33 crc kubenswrapper[4925]: I0226 08:45:33.063970 4925 csr.go:261] certificate signing request csr-j7gpg is approved, waiting to be issued Feb 26 08:45:33 crc kubenswrapper[4925]: I0226 08:45:33.073932 4925 csr.go:257] certificate signing request csr-j7gpg is issued Feb 26 08:45:33 crc kubenswrapper[4925]: I0226 08:45:33.112563 4925 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 26 08:45:33 crc kubenswrapper[4925]: I0226 08:45:33.319316 4925 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 26 08:45:34 crc kubenswrapper[4925]: I0226 08:45:34.075912 4925 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-29 01:17:02.228829907 +0000 UTC Feb 26 08:45:34 crc kubenswrapper[4925]: I0226 08:45:34.075958 4925 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6616h31m28.152876149s for next certificate rotation Feb 26 08:45:38 crc kubenswrapper[4925]: E0226 08:45:38.605955 4925 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.626847 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.628116 4925 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.628150 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.628160 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.628279 4925 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.639138 4925 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.639617 4925 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 26 08:45:38 crc kubenswrapper[4925]: E0226 08:45:38.639651 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.643992 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.644048 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.644059 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.644076 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.644087 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:38Z","lastTransitionTime":"2026-02-26T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:38 crc kubenswrapper[4925]: E0226 08:45:38.658869 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.668123 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.668204 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.668217 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.668240 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.668274 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:38Z","lastTransitionTime":"2026-02-26T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:38 crc kubenswrapper[4925]: E0226 08:45:38.678587 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.687867 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.687941 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.687959 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.687982 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.688000 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:38Z","lastTransitionTime":"2026-02-26T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:38 crc kubenswrapper[4925]: E0226 08:45:38.700573 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.711834 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.711884 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.711896 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.711915 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:38 crc kubenswrapper[4925]: I0226 08:45:38.711928 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:38Z","lastTransitionTime":"2026-02-26T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:38 crc kubenswrapper[4925]: E0226 08:45:38.728365 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:38 crc kubenswrapper[4925]: E0226 08:45:38.728518 4925 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 08:45:38 crc kubenswrapper[4925]: E0226 08:45:38.728575 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:38 crc kubenswrapper[4925]: E0226 08:45:38.829678 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:38 crc kubenswrapper[4925]: E0226 08:45:38.930081 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:39 crc kubenswrapper[4925]: E0226 08:45:39.031353 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:39 crc kubenswrapper[4925]: E0226 08:45:39.132286 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:39 crc kubenswrapper[4925]: E0226 08:45:39.233293 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:39 crc kubenswrapper[4925]: E0226 08:45:39.333392 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:39 crc kubenswrapper[4925]: E0226 08:45:39.433642 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:39 crc kubenswrapper[4925]: I0226 08:45:39.506878 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 08:45:39 crc kubenswrapper[4925]: I0226 08:45:39.508857 4925 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:39 crc kubenswrapper[4925]: I0226 08:45:39.508915 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:39 crc kubenswrapper[4925]: I0226 08:45:39.508934 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:39 crc kubenswrapper[4925]: I0226 08:45:39.510277 4925 scope.go:117] "RemoveContainer" containerID="8d6d215523be6e52c9479f608fe2baa4f683407f8fd62dbd634a22e692a38e34" Feb 26 08:45:39 crc kubenswrapper[4925]: E0226 08:45:39.534706 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:39 crc kubenswrapper[4925]: E0226 08:45:39.636017 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:39 crc kubenswrapper[4925]: E0226 08:45:39.737160 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:39 crc kubenswrapper[4925]: I0226 08:45:39.775386 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 08:45:39 crc kubenswrapper[4925]: I0226 08:45:39.778198 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011"} Feb 26 08:45:39 crc kubenswrapper[4925]: I0226 08:45:39.778374 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 08:45:39 crc kubenswrapper[4925]: I0226 08:45:39.779446 4925 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:39 crc kubenswrapper[4925]: I0226 08:45:39.779500 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:39 crc kubenswrapper[4925]: I0226 08:45:39.779513 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:39 crc kubenswrapper[4925]: E0226 08:45:39.837728 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:39 crc kubenswrapper[4925]: E0226 08:45:39.937947 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:40 crc kubenswrapper[4925]: E0226 08:45:40.038452 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:40 crc kubenswrapper[4925]: E0226 08:45:40.139517 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:40 crc kubenswrapper[4925]: E0226 08:45:40.239908 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:40 crc kubenswrapper[4925]: E0226 08:45:40.340857 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:40 crc kubenswrapper[4925]: E0226 08:45:40.441255 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:40 crc kubenswrapper[4925]: E0226 08:45:40.541573 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:40 crc kubenswrapper[4925]: E0226 08:45:40.642552 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:40 crc kubenswrapper[4925]: E0226 08:45:40.743626 4925 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:40 crc kubenswrapper[4925]: I0226 08:45:40.783133 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 08:45:40 crc kubenswrapper[4925]: I0226 08:45:40.783761 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 08:45:40 crc kubenswrapper[4925]: I0226 08:45:40.785620 4925 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011" exitCode=255 Feb 26 08:45:40 crc kubenswrapper[4925]: I0226 08:45:40.785669 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011"} Feb 26 08:45:40 crc kubenswrapper[4925]: I0226 08:45:40.785717 4925 scope.go:117] "RemoveContainer" containerID="8d6d215523be6e52c9479f608fe2baa4f683407f8fd62dbd634a22e692a38e34" Feb 26 08:45:40 crc kubenswrapper[4925]: I0226 08:45:40.785893 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 08:45:40 crc kubenswrapper[4925]: I0226 08:45:40.787129 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:40 crc kubenswrapper[4925]: I0226 08:45:40.787173 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:40 crc kubenswrapper[4925]: I0226 08:45:40.787190 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 26 08:45:40 crc kubenswrapper[4925]: I0226 08:45:40.788102 4925 scope.go:117] "RemoveContainer" containerID="450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011" Feb 26 08:45:40 crc kubenswrapper[4925]: E0226 08:45:40.788365 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 08:45:40 crc kubenswrapper[4925]: E0226 08:45:40.844252 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:40 crc kubenswrapper[4925]: E0226 08:45:40.944818 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:41 crc kubenswrapper[4925]: E0226 08:45:41.045583 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:41 crc kubenswrapper[4925]: E0226 08:45:41.146465 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:41 crc kubenswrapper[4925]: E0226 08:45:41.247414 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:41 crc kubenswrapper[4925]: E0226 08:45:41.347861 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:41 crc kubenswrapper[4925]: E0226 08:45:41.447979 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:41 crc kubenswrapper[4925]: E0226 08:45:41.548151 4925 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Feb 26 08:45:41 crc kubenswrapper[4925]: I0226 08:45:41.560616 4925 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 26 08:45:41 crc kubenswrapper[4925]: E0226 08:45:41.648749 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:41 crc kubenswrapper[4925]: E0226 08:45:41.749440 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:41 crc kubenswrapper[4925]: I0226 08:45:41.790392 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 08:45:41 crc kubenswrapper[4925]: E0226 08:45:41.849874 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:41 crc kubenswrapper[4925]: E0226 08:45:41.950330 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:42 crc kubenswrapper[4925]: E0226 08:45:42.051588 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:42 crc kubenswrapper[4925]: E0226 08:45:42.152198 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:42 crc kubenswrapper[4925]: E0226 08:45:42.253158 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:42 crc kubenswrapper[4925]: E0226 08:45:42.353661 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:42 crc kubenswrapper[4925]: E0226 08:45:42.454607 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:42 crc 
kubenswrapper[4925]: I0226 08:45:42.524184 4925 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:45:42 crc kubenswrapper[4925]: I0226 08:45:42.524437 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 08:45:42 crc kubenswrapper[4925]: I0226 08:45:42.526087 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:42 crc kubenswrapper[4925]: I0226 08:45:42.526123 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:42 crc kubenswrapper[4925]: I0226 08:45:42.526136 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:42 crc kubenswrapper[4925]: I0226 08:45:42.526872 4925 scope.go:117] "RemoveContainer" containerID="450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011" Feb 26 08:45:42 crc kubenswrapper[4925]: E0226 08:45:42.527105 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 08:45:42 crc kubenswrapper[4925]: E0226 08:45:42.555890 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:42 crc kubenswrapper[4925]: E0226 08:45:42.656093 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:42 crc kubenswrapper[4925]: E0226 08:45:42.756932 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not 
found" Feb 26 08:45:42 crc kubenswrapper[4925]: E0226 08:45:42.857879 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:42 crc kubenswrapper[4925]: E0226 08:45:42.958751 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:43 crc kubenswrapper[4925]: E0226 08:45:43.059613 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:43 crc kubenswrapper[4925]: E0226 08:45:43.160795 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:43 crc kubenswrapper[4925]: E0226 08:45:43.261787 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:43 crc kubenswrapper[4925]: E0226 08:45:43.362581 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:43 crc kubenswrapper[4925]: E0226 08:45:43.463401 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:43 crc kubenswrapper[4925]: E0226 08:45:43.563834 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:43 crc kubenswrapper[4925]: E0226 08:45:43.664514 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:43 crc kubenswrapper[4925]: E0226 08:45:43.765730 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:43 crc kubenswrapper[4925]: E0226 08:45:43.866849 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:43 crc kubenswrapper[4925]: E0226 08:45:43.967457 4925 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 26 08:45:44 crc kubenswrapper[4925]: E0226 08:45:44.068503 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:44 crc kubenswrapper[4925]: E0226 08:45:44.169718 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:44 crc kubenswrapper[4925]: E0226 08:45:44.270739 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:44 crc kubenswrapper[4925]: E0226 08:45:44.371272 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:44 crc kubenswrapper[4925]: E0226 08:45:44.472139 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.506498 4925 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.508190 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.508255 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.508284 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:44 crc kubenswrapper[4925]: E0226 08:45:44.573167 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:44 crc kubenswrapper[4925]: E0226 08:45:44.674269 4925 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:44 crc kubenswrapper[4925]: E0226 08:45:44.775309 4925 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.792123 4925 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.877972 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.878019 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.878030 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.878052 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.878064 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:44Z","lastTransitionTime":"2026-02-26T08:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.981012 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.981054 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.981068 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.981084 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:44 crc kubenswrapper[4925]: I0226 08:45:44.981097 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:44Z","lastTransitionTime":"2026-02-26T08:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.083597 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.083680 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.083706 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.083741 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.083766 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:45Z","lastTransitionTime":"2026-02-26T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.187198 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.187265 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.187287 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.187314 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.187332 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:45Z","lastTransitionTime":"2026-02-26T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.289433 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.289480 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.289493 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.289513 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.289548 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:45Z","lastTransitionTime":"2026-02-26T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.391939 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.392019 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.392037 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.392063 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.392081 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:45Z","lastTransitionTime":"2026-02-26T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.477247 4925 apiserver.go:52] "Watching apiserver" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.488561 4925 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.489281 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj","openshift-ovn-kubernetes/ovnkube-node-pv859","openshift-multus/network-metrics-daemon-96njb","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-image-registry/node-ca-n8ts9","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-machine-config-operator/machine-config-daemon-s6pcl","openshift-multus/multus-fmxxp","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-dns/node-resolver-ztgdx","openshift-multus/multus-additional-cni-plugins-sz9t7"] Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.489888 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.490015 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.490217 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.490323 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.490599 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.490630 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.490696 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.492642 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n8ts9" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.492689 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.492800 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.492947 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ztgdx" Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.493055 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.493696 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.493887 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.494669 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.494774 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.494846 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.495644 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.496981 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.497083 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.497102 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.497181 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.497207 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:45Z","lastTransitionTime":"2026-02-26T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.498290 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.498795 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.499153 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.499463 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.499495 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.499654 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.499741 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.500174 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.500894 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.501104 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.501686 4925 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.501834 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.501878 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.501906 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.501986 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.502026 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.501876 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.502062 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.502066 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.502112 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.502298 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.502316 4925 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.502376 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.502463 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.502595 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.502719 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.502816 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.502928 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.503576 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.503602 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.506245 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.506421 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.506572 4925 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.506615 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.506755 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.506822 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.506850 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.526272 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.539559 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.549456 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.556029 4925 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.560172 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.572625 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.585608 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588005 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588037 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588057 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588075 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588093 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588109 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588127 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588142 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 
08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588158 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588175 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588191 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588208 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588223 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588238 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") 
pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588254 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588273 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588290 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588307 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588310 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588324 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588339 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588353 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588372 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588390 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588404 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588420 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588437 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588452 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588468 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588482 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588497 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588513 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588549 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588578 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588600 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588622 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588636 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588647 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588666 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588680 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588690 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588727 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588753 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588778 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588800 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588824 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588890 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588919 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588943 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588919 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588966 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588959 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.588990 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589031 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589084 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589173 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589180 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589199 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589229 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589245 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589271 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589317 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589342 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589366 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589410 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589433 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589455 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589477 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589503 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589552 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589577 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589598 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589645 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589667 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589710 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589737 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589758 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589802 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589810 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589823 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589844 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589891 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589912 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589957 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.589980 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590006 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590052 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590073 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590117 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590176 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590192 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590220 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590241 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590286 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590308 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590330 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590360 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590376 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590401 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590440 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590445 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590486 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590553 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590583 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590610 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590635 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590647 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590659 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590708 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590726 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590744 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590761 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590777 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590794 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590810 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590828 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590845 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590864 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590880 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590899 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590917 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590935 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName:
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590961 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590983 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591008 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591033 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591059 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591076 
4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591092 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591108 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591182 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591200 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591215 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591232 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591249 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591266 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591281 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591296 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 08:45:45 crc 
kubenswrapper[4925]: I0226 08:45:45.591312 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591327 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591343 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591359 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591374 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591389 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591404 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591421 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591435 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591451 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591466 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 26 08:45:45 crc kubenswrapper[4925]: 
I0226 08:45:45.591483 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591500 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591547 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591565 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591582 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591599 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591634 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591651 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591667 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591683 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591732 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 08:45:45 crc kubenswrapper[4925]: 
I0226 08:45:45.591774 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591800 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591831 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591863 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591887 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591910 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591934 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591956 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591979 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592006 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592041 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592076 4925 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592110 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592143 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592175 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592205 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592241 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592280 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592313 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592336 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592359 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592385 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592409 4925 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592432 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592457 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592480 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592503 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592559 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592585 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592608 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592631 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592653 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593120 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 08:45:45 
crc kubenswrapper[4925]: I0226 08:45:45.593162 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593215 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593245 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593281 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593315 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593349 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593387 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593426 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593464 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593497 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593554 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593590 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593625 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593661 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593694 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593785 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593829 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593874 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htmnv\" (UniqueName: \"kubernetes.io/projected/671c7a2c-75d0-4322-a581-3d1b26bda633-kube-api-access-htmnv\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593912 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/700c9110-ceb1-4a4a-8fea-0fca5904ff5d-host\") pod \"node-ca-n8ts9\" (UID: \"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\") " pod="openshift-image-registry/node-ca-n8ts9" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593946 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rngdj\" (UniqueName: \"kubernetes.io/projected/90c32fd9-f2da-4ba9-8cf9-703b6de14025-kube-api-access-rngdj\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594632 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594780 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594807 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-etc-openvswitch\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594830 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-run-ovn-kubernetes\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594857 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjl2n\" (UniqueName: \"kubernetes.io/projected/9d171f4c-5b3c-498e-acee-1725e2921586-kube-api-access-kjl2n\") pod \"node-resolver-ztgdx\" (UID: \"9d171f4c-5b3c-498e-acee-1725e2921586\") " pod="openshift-dns/node-resolver-ztgdx" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594889 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594914 4925 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-slash\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594940 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-var-lib-cni-multus\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594966 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-hostroot\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594991 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/03d05327-5621-49e8-8051-9dccd63f3a5b-multus-daemon-config\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595017 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-run-multus-certs\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595042 4925 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrhqb\" (UniqueName: \"kubernetes.io/projected/03d05327-5621-49e8-8051-9dccd63f3a5b-kube-api-access-jrhqb\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595067 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595092 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-run-openvswitch\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595123 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-node-log\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595159 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-cni-netd\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595192 4925 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/90c32fd9-f2da-4ba9-8cf9-703b6de14025-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595227 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595258 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-kubelet\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595289 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-var-lib-openvswitch\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595316 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/671c7a2c-75d0-4322-a581-3d1b26bda633-ovnkube-script-lib\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc 
kubenswrapper[4925]: I0226 08:45:45.595341 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-run-k8s-cni-cncf-io\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595365 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bdea74e1-d37c-4aa3-861e-6d4eab3a870a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6hpsj\" (UID: \"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595385 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-systemd-units\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595416 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.596797 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52bdf0f6-9afd-42fe-9a30-7da82bc7f619-proxy-tls\") pod \"machine-config-daemon-s6pcl\" (UID: 
\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\") " pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.596843 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/52bdf0f6-9afd-42fe-9a30-7da82bc7f619-mcd-auth-proxy-config\") pod \"machine-config-daemon-s6pcl\" (UID: \"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\") " pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.596950 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp4kb\" (UniqueName: \"kubernetes.io/projected/52bdf0f6-9afd-42fe-9a30-7da82bc7f619-kube-api-access-sp4kb\") pod \"machine-config-daemon-s6pcl\" (UID: \"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\") " pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.596976 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-run-systemd\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597000 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-system-cni-dir\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597024 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-log-socket\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597059 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597087 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d171f4c-5b3c-498e-acee-1725e2921586-hosts-file\") pod \"node-resolver-ztgdx\" (UID: \"9d171f4c-5b3c-498e-acee-1725e2921586\") " pod="openshift-dns/node-resolver-ztgdx" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597120 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-multus-cni-dir\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597416 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-os-release\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597460 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-cni-bin\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597506 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03d05327-5621-49e8-8051-9dccd63f3a5b-cni-binary-copy\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597550 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-etc-kubernetes\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597573 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/90c32fd9-f2da-4ba9-8cf9-703b6de14025-os-release\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597598 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vdxb\" (UniqueName: \"kubernetes.io/projected/bdea74e1-d37c-4aa3-861e-6d4eab3a870a-kube-api-access-7vdxb\") pod \"ovnkube-control-plane-749d76644c-6hpsj\" (UID: \"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597621 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-cnibin\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597646 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7bg6\" (UniqueName: \"kubernetes.io/projected/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-kube-api-access-w7bg6\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597678 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597700 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-var-lib-kubelet\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597724 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/90c32fd9-f2da-4ba9-8cf9-703b6de14025-cni-binary-copy\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597748 4925 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bdea74e1-d37c-4aa3-861e-6d4eab3a870a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6hpsj\" (UID: \"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597773 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597796 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/52bdf0f6-9afd-42fe-9a30-7da82bc7f619-rootfs\") pod \"machine-config-daemon-s6pcl\" (UID: \"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\") " pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597825 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598000 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bdea74e1-d37c-4aa3-861e-6d4eab3a870a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6hpsj\" (UID: \"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598168 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598229 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-run-ovn\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598265 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/671c7a2c-75d0-4322-a581-3d1b26bda633-ovnkube-config\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598304 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/671c7a2c-75d0-4322-a581-3d1b26bda633-ovn-node-metrics-cert\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598336 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" 
(UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598378 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwkvp\" (UniqueName: \"kubernetes.io/projected/700c9110-ceb1-4a4a-8fea-0fca5904ff5d-kube-api-access-fwkvp\") pod \"node-ca-n8ts9\" (UID: \"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\") " pod="openshift-image-registry/node-ca-n8ts9" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598409 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-run-netns\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598438 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/671c7a2c-75d0-4322-a581-3d1b26bda633-env-overrides\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598477 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598508 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/700c9110-ceb1-4a4a-8fea-0fca5904ff5d-serviceca\") pod 
\"node-ca-n8ts9\" (UID: \"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\") " pod="openshift-image-registry/node-ca-n8ts9" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598565 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598597 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-run-netns\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598627 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-multus-conf-dir\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598664 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90c32fd9-f2da-4ba9-8cf9-703b6de14025-system-cni-dir\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598695 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/90c32fd9-f2da-4ba9-8cf9-703b6de14025-cnibin\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598727 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/90c32fd9-f2da-4ba9-8cf9-703b6de14025-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598760 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-multus-socket-dir-parent\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598792 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-var-lib-cni-bin\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598868 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598892 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" 
DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598916 4925 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598935 4925 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598951 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598970 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598990 4925 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599013 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599031 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599047 4925 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599061 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599076 4925 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599091 4925 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.604058 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590741 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.590879 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591074 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591137 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591352 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591364 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591588 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591836 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.591862 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592038 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592455 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592495 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592491 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592413 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.607176 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.607238 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592516 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592762 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592869 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.592954 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593048 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593212 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593376 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593433 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.606255 4925 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.607465 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593438 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.593723 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594051 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594080 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594414 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.594632 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595144 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595740 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.595865 4925 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.595984 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.596100 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.596240 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.596508 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.596513 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.596751 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.596771 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.596844 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597136 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597627 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597744 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.597985 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598079 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598091 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598343 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598264 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598443 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.598791 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599031 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.599122 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:45:46.09910331 +0000 UTC m=+78.238677594 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599265 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599279 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599381 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599711 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599761 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599743 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599801 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.599893 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.600224 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.600315 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.600329 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.600337 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.600431 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.600705 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.601335 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.601346 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.601640 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.601637 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.601853 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.601960 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.602318 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.602302 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.602293 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.602448 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.602454 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.602462 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.602470 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.602484 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.602495 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.602825 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.602830 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.602970 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.603279 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.603314 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.603543 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.603539 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.603578 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.603790 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.608063 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.603924 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.603958 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.604042 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.604159 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.604837 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.604901 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.604991 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.605335 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.605389 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.605435 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.605429 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.605416 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.605495 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.605753 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.605789 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.605864 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.605969 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.607058 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.607894 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.607889 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.608147 4925 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.608363 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.608416 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.608680 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.608751 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.609219 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.610068 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.607653 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.608056 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.610936 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.610960 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.610989 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.611008 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:45Z","lastTransitionTime":"2026-02-26T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.611090 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-26 08:45:46.111038489 +0000 UTC m=+78.250612772 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.611286 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:46.111268435 +0000 UTC m=+78.250842718 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.611508 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.611637 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). 
InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.611739 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.611900 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.612070 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.612311 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.612639 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.612750 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.619670 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.619700 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.619716 4925 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.619773 4925 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:46.119761781 +0000 UTC m=+78.259336064 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.620777 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.621369 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.621409 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.621428 4925 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.621501 
4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:46.121478987 +0000 UTC m=+78.261053300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.623334 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.623402 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.624247 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.627743 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.627868 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.628356 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.628925 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.629079 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.629153 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.629458 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.629479 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.629471 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.629684 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.629740 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.630024 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.630604 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.634034 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.634961 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.636243 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.636455 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.636653 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.636803 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.637090 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.637491 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.637692 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.638069 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.638194 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.638079 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.638291 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.638440 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.638628 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.638926 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.639034 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.639189 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.639271 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.639299 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.639731 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.639811 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.639958 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.640148 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.640334 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.640446 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.640468 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.640541 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.640687 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.640902 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.640929 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.641443 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.641467 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.641477 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.641757 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.641759 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.641772 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.642091 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.642184 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.642234 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.642609 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.642779 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.643611 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.641398 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.643656 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.645603 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.654566 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.656614 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.661279 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.661440 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.669167 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.676643 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.684026 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.691328 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.697862 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700429 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700495 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/52bdf0f6-9afd-42fe-9a30-7da82bc7f619-rootfs\") pod \"machine-config-daemon-s6pcl\" (UID: \"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\") " pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700609 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/52bdf0f6-9afd-42fe-9a30-7da82bc7f619-rootfs\") pod \"machine-config-daemon-s6pcl\" (UID: \"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\") " pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700632 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bdea74e1-d37c-4aa3-861e-6d4eab3a870a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6hpsj\" (UID: \"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700661 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-run-ovn\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700686 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/671c7a2c-75d0-4322-a581-3d1b26bda633-ovnkube-config\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700583 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700702 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/671c7a2c-75d0-4322-a581-3d1b26bda633-ovn-node-metrics-cert\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700767 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700802 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwkvp\" (UniqueName: 
\"kubernetes.io/projected/700c9110-ceb1-4a4a-8fea-0fca5904ff5d-kube-api-access-fwkvp\") pod \"node-ca-n8ts9\" (UID: \"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\") " pod="openshift-image-registry/node-ca-n8ts9" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700833 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-run-netns\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700857 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/671c7a2c-75d0-4322-a581-3d1b26bda633-env-overrides\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700881 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/700c9110-ceb1-4a4a-8fea-0fca5904ff5d-serviceca\") pod \"node-ca-n8ts9\" (UID: \"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\") " pod="openshift-image-registry/node-ca-n8ts9" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700911 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700940 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-run-netns\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700965 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-multus-conf-dir\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.700988 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90c32fd9-f2da-4ba9-8cf9-703b6de14025-system-cni-dir\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701011 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/90c32fd9-f2da-4ba9-8cf9-703b6de14025-cnibin\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701035 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/90c32fd9-f2da-4ba9-8cf9-703b6de14025-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701064 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-multus-socket-dir-parent\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701094 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-var-lib-cni-bin\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701125 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htmnv\" (UniqueName: \"kubernetes.io/projected/671c7a2c-75d0-4322-a581-3d1b26bda633-kube-api-access-htmnv\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701147 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/700c9110-ceb1-4a4a-8fea-0fca5904ff5d-host\") pod \"node-ca-n8ts9\" (UID: \"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\") " pod="openshift-image-registry/node-ca-n8ts9" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701171 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rngdj\" (UniqueName: \"kubernetes.io/projected/90c32fd9-f2da-4ba9-8cf9-703b6de14025-kube-api-access-rngdj\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701195 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701238 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-etc-openvswitch\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701246 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-multus-conf-dir\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701261 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-run-ovn-kubernetes\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701287 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjl2n\" (UniqueName: \"kubernetes.io/projected/9d171f4c-5b3c-498e-acee-1725e2921586-kube-api-access-kjl2n\") pod \"node-resolver-ztgdx\" (UID: \"9d171f4c-5b3c-498e-acee-1725e2921586\") " pod="openshift-dns/node-resolver-ztgdx" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701314 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-slash\") 
pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701338 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-var-lib-cni-multus\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701363 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-hostroot\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701386 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/03d05327-5621-49e8-8051-9dccd63f3a5b-multus-daemon-config\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701382 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-var-lib-cni-bin\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701444 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-run-multus-certs\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc 
kubenswrapper[4925]: I0226 08:45:45.701411 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-run-multus-certs\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701476 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-run-ovn-kubernetes\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701503 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrhqb\" (UniqueName: \"kubernetes.io/projected/03d05327-5621-49e8-8051-9dccd63f3a5b-kube-api-access-jrhqb\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701570 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-run-openvswitch\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701607 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-node-log\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701645 4925 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-cni-netd\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701689 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-slash\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701695 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/90c32fd9-f2da-4ba9-8cf9-703b6de14025-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701729 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-var-lib-cni-multus\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701730 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-kubelet\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701769 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-var-lib-openvswitch\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701772 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-kubelet\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701792 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/671c7a2c-75d0-4322-a581-3d1b26bda633-ovnkube-script-lib\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701773 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-multus-socket-dir-parent\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 08:45:45.701814 4925 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701846 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-run-k8s-cni-cncf-io\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: E0226 
08:45:45.701861 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs podName:c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc nodeName:}" failed. No retries permitted until 2026-02-26 08:45:46.201846519 +0000 UTC m=+78.341420802 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs") pod "network-metrics-daemon-96njb" (UID: "c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701881 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-hostroot\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701887 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-etc-openvswitch\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701820 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-run-k8s-cni-cncf-io\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701897 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-cni-netd\") pod 
\"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701919 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bdea74e1-d37c-4aa3-861e-6d4eab3a870a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6hpsj\" (UID: \"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701937 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-systemd-units\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701964 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52bdf0f6-9afd-42fe-9a30-7da82bc7f619-proxy-tls\") pod \"machine-config-daemon-s6pcl\" (UID: \"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\") " pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701974 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-var-lib-openvswitch\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701982 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/52bdf0f6-9afd-42fe-9a30-7da82bc7f619-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-s6pcl\" (UID: \"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\") " pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702030 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp4kb\" (UniqueName: \"kubernetes.io/projected/52bdf0f6-9afd-42fe-9a30-7da82bc7f619-kube-api-access-sp4kb\") pod \"machine-config-daemon-s6pcl\" (UID: \"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\") " pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702059 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-run-systemd\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702080 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-system-cni-dir\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702103 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-log-socket\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702108 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/90c32fd9-f2da-4ba9-8cf9-703b6de14025-cnibin\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: 
\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702163 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d171f4c-5b3c-498e-acee-1725e2921586-hosts-file\") pod \"node-resolver-ztgdx\" (UID: \"9d171f4c-5b3c-498e-acee-1725e2921586\") " pod="openshift-dns/node-resolver-ztgdx" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702189 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-multus-cni-dir\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702214 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-os-release\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702239 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-cni-bin\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702260 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03d05327-5621-49e8-8051-9dccd63f3a5b-cni-binary-copy\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702280 
4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-etc-kubernetes\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702283 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-run-ovn\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702293 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/90c32fd9-f2da-4ba9-8cf9-703b6de14025-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702300 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/90c32fd9-f2da-4ba9-8cf9-703b6de14025-os-release\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702345 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/90c32fd9-f2da-4ba9-8cf9-703b6de14025-os-release\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702351 4925 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-7vdxb\" (UniqueName: \"kubernetes.io/projected/bdea74e1-d37c-4aa3-861e-6d4eab3a870a-kube-api-access-7vdxb\") pod \"ovnkube-control-plane-749d76644c-6hpsj\" (UID: \"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701732 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-run-openvswitch\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702396 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-log-socket\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702397 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/700c9110-ceb1-4a4a-8fea-0fca5904ff5d-host\") pod \"node-ca-n8ts9\" (UID: \"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\") " pod="openshift-image-registry/node-ca-n8ts9" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702427 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-run-netns\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702445 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/9d171f4c-5b3c-498e-acee-1725e2921586-hosts-file\") pod \"node-resolver-ztgdx\" (UID: \"9d171f4c-5b3c-498e-acee-1725e2921586\") " pod="openshift-dns/node-resolver-ztgdx" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702492 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-multus-cni-dir\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702554 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-os-release\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702581 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702592 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-cni-bin\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702689 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/52bdf0f6-9afd-42fe-9a30-7da82bc7f619-mcd-auth-proxy-config\") pod \"machine-config-daemon-s6pcl\" 
(UID: \"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\") " pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.702729 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-systemd-units\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.703180 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/671c7a2c-75d0-4322-a581-3d1b26bda633-env-overrides\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.703192 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bdea74e1-d37c-4aa3-861e-6d4eab3a870a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6hpsj\" (UID: \"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.703241 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-host-run-netns\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701856 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-node-log\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701697 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90c32fd9-f2da-4ba9-8cf9-703b6de14025-system-cni-dir\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.701448 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.703298 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-etc-kubernetes\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.703435 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-cnibin\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.703510 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7bg6\" (UniqueName: \"kubernetes.io/projected/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-kube-api-access-w7bg6\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:45:45 crc 
kubenswrapper[4925]: I0226 08:45:45.703590 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/671c7a2c-75d0-4322-a581-3d1b26bda633-ovnkube-config\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.703653 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-cnibin\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.703609 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/671c7a2c-75d0-4322-a581-3d1b26bda633-run-systemd\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.703720 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-var-lib-kubelet\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.703771 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/671c7a2c-75d0-4322-a581-3d1b26bda633-ovnkube-script-lib\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.703805 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/90c32fd9-f2da-4ba9-8cf9-703b6de14025-cni-binary-copy\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.703847 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-system-cni-dir\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.703864 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bdea74e1-d37c-4aa3-861e-6d4eab3a870a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6hpsj\" (UID: \"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.704696 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/03d05327-5621-49e8-8051-9dccd63f3a5b-host-var-lib-kubelet\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.704775 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/90c32fd9-f2da-4ba9-8cf9-703b6de14025-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.704986 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705015 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705090 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705113 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705134 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705154 4925 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705173 4925 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705192 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node 
\"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705212 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705231 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705251 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705270 4925 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705289 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705309 4925 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705329 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 
08:45:45.705348 4925 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705366 4925 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705385 4925 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705404 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705423 4925 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705441 4925 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705459 4925 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705483 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705504 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705549 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705569 4925 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705589 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705608 4925 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705628 4925 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705648 4925 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" 
DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705666 4925 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705685 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705710 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705731 4925 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705750 4925 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705771 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705800 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: 
I0226 08:45:45.705820 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705840 4925 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705860 4925 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705879 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705898 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705920 4925 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705939 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705958 4925 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705980 4925 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706000 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706019 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706039 4925 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706057 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706075 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706095 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706141 4925 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706161 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706181 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706221 4925 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706240 4925 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706261 4925 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706280 4925 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706299 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706318 4925 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706336 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706355 4925 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706374 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706394 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706413 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" 
DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706432 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706452 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706472 4925 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706490 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706508 4925 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.705752 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bdea74e1-d37c-4aa3-861e-6d4eab3a870a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6hpsj\" (UID: \"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706448 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/700c9110-ceb1-4a4a-8fea-0fca5904ff5d-serviceca\") pod \"node-ca-n8ts9\" (UID: \"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\") " pod="openshift-image-registry/node-ca-n8ts9" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706551 4925 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706608 4925 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706624 4925 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706644 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706657 4925 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706671 4925 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706685 4925 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706615 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03d05327-5621-49e8-8051-9dccd63f3a5b-cni-binary-copy\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706715 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/90c32fd9-f2da-4ba9-8cf9-703b6de14025-cni-binary-copy\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706698 4925 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706790 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706808 4925 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706825 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 26 
08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706839 4925 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706852 4925 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706864 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706876 4925 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706889 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706903 4925 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706914 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706926 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706938 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706950 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706961 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706973 4925 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706986 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.706998 4925 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707009 4925 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707019 4925 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707030 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707042 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707053 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707064 4925 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707076 4925 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707087 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on 
node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707085 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bdea74e1-d37c-4aa3-861e-6d4eab3a870a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6hpsj\" (UID: \"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707100 4925 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707112 4925 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707125 4925 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707136 4925 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707147 4925 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707157 4925 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707174 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707185 4925 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707195 4925 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707206 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707220 4925 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707233 4925 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707245 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707256 4925 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707267 4925 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707280 4925 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707290 4925 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707301 4925 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707312 4925 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707323 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707334 4925 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707354 4925 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707363 4925 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707373 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707388 4925 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707398 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707410 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node 
\"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707421 4925 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707431 4925 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707441 4925 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707452 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707462 4925 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707472 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707484 4925 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707495 4925 reconciler_common.go:293] "Volume detached 
for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707507 4925 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707517 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707541 4925 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707551 4925 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707561 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707572 4925 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707581 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on 
node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707592 4925 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707602 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707612 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707621 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707632 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707642 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707653 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" 
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707663 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707673 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707684 4925 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707694 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707704 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707713 4925 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707723 4925 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707733 4925 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707742 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707751 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707761 4925 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707772 4925 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707783 4925 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707792 4925 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707802 4925 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath 
\"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707812 4925 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707809 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52bdf0f6-9afd-42fe-9a30-7da82bc7f619-proxy-tls\") pod \"machine-config-daemon-s6pcl\" (UID: \"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\") " pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707822 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707879 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707893 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707904 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707915 4925 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707925 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707934 4925 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707944 4925 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707953 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707963 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707971 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.707981 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on 
node \"crc\" DevicePath \"\"" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.709578 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/671c7a2c-75d0-4322-a581-3d1b26bda633-ovn-node-metrics-cert\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.712844 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/03d05327-5621-49e8-8051-9dccd63f3a5b-multus-daemon-config\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.715878 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.715923 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.715945 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.715977 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.716001 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:45Z","lastTransitionTime":"2026-02-26T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.716305 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrhqb\" (UniqueName: \"kubernetes.io/projected/03d05327-5621-49e8-8051-9dccd63f3a5b-kube-api-access-jrhqb\") pod \"multus-fmxxp\" (UID: \"03d05327-5621-49e8-8051-9dccd63f3a5b\") " pod="openshift-multus/multus-fmxxp" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.720513 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjl2n\" (UniqueName: \"kubernetes.io/projected/9d171f4c-5b3c-498e-acee-1725e2921586-kube-api-access-kjl2n\") pod \"node-resolver-ztgdx\" (UID: \"9d171f4c-5b3c-498e-acee-1725e2921586\") " pod="openshift-dns/node-resolver-ztgdx" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.723887 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rngdj\" (UniqueName: \"kubernetes.io/projected/90c32fd9-f2da-4ba9-8cf9-703b6de14025-kube-api-access-rngdj\") pod \"multus-additional-cni-plugins-sz9t7\" (UID: \"90c32fd9-f2da-4ba9-8cf9-703b6de14025\") " pod="openshift-multus/multus-additional-cni-plugins-sz9t7" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.725356 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htmnv\" (UniqueName: \"kubernetes.io/projected/671c7a2c-75d0-4322-a581-3d1b26bda633-kube-api-access-htmnv\") pod \"ovnkube-node-pv859\" (UID: \"671c7a2c-75d0-4322-a581-3d1b26bda633\") " pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.726345 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vdxb\" (UniqueName: \"kubernetes.io/projected/bdea74e1-d37c-4aa3-861e-6d4eab3a870a-kube-api-access-7vdxb\") pod \"ovnkube-control-plane-749d76644c-6hpsj\" (UID: \"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.730053 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwkvp\" (UniqueName: \"kubernetes.io/projected/700c9110-ceb1-4a4a-8fea-0fca5904ff5d-kube-api-access-fwkvp\") pod \"node-ca-n8ts9\" (UID: \"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\") " pod="openshift-image-registry/node-ca-n8ts9"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.731975 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp4kb\" (UniqueName: \"kubernetes.io/projected/52bdf0f6-9afd-42fe-9a30-7da82bc7f619-kube-api-access-sp4kb\") pod \"machine-config-daemon-s6pcl\" (UID: \"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\") " pod="openshift-machine-config-operator/machine-config-daemon-s6pcl"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.733192 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7bg6\" (UniqueName: \"kubernetes.io/projected/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-kube-api-access-w7bg6\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.819146 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.819646 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.819750 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.819812 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.819915 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.819996 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:45Z","lastTransitionTime":"2026-02-26T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.827332 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.839285 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 26 08:45:45 crc kubenswrapper[4925]: W0226 08:45:45.844004 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-a5b239656c1fe9963d9df9b297a02c77dbd51cf9cca9ff32ea68f267d33c3172 WatchSource:0}: Error finding container a5b239656c1fe9963d9df9b297a02c77dbd51cf9cca9ff32ea68f267d33c3172: Status 404 returned error can't find the container with id a5b239656c1fe9963d9df9b297a02c77dbd51cf9cca9ff32ea68f267d33c3172
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.849776 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n8ts9"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.859486 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ztgdx"
Feb 26 08:45:45 crc kubenswrapper[4925]: W0226 08:45:45.865215 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-cf7dc54bd6b900f9205cb9d2693b38411cf8406e243afbb52b87aec846773811 WatchSource:0}: Error finding container cf7dc54bd6b900f9205cb9d2693b38411cf8406e243afbb52b87aec846773811: Status 404 returned error can't find the container with id cf7dc54bd6b900f9205cb9d2693b38411cf8406e243afbb52b87aec846773811
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.871064 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.873990 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fmxxp"
Feb 26 08:45:45 crc kubenswrapper[4925]: W0226 08:45:45.874293 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod700c9110_ceb1_4a4a_8fea_0fca5904ff5d.slice/crio-ba5dad65dd9f1e776d8331d8f56646b4f313cf320de576de1480364e01800207 WatchSource:0}: Error finding container ba5dad65dd9f1e776d8331d8f56646b4f313cf320de576de1480364e01800207: Status 404 returned error can't find the container with id ba5dad65dd9f1e776d8331d8f56646b4f313cf320de576de1480364e01800207
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.880896 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj"
Feb 26 08:45:45 crc kubenswrapper[4925]: W0226 08:45:45.894129 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d171f4c_5b3c_498e_acee_1725e2921586.slice/crio-05b9624594fc1c3ff2e35650083a560b5ac3276dafca4077c71619e9c8857044 WatchSource:0}: Error finding container 05b9624594fc1c3ff2e35650083a560b5ac3276dafca4077c71619e9c8857044: Status 404 returned error can't find the container with id 05b9624594fc1c3ff2e35650083a560b5ac3276dafca4077c71619e9c8857044
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.898193 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pv859"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.913342 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sz9t7"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.922892 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.922956 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.922978 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.923004 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:45 crc kubenswrapper[4925]: I0226 08:45:45.923023 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:45Z","lastTransitionTime":"2026-02-26T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:45 crc kubenswrapper[4925]: W0226 08:45:45.933451 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03d05327_5621_49e8_8051_9dccd63f3a5b.slice/crio-1add7085d265eca0306966f10e7ec52b653c03fc3aeb3e428ddc5d73e037779d WatchSource:0}: Error finding container 1add7085d265eca0306966f10e7ec52b653c03fc3aeb3e428ddc5d73e037779d: Status 404 returned error can't find the container with id 1add7085d265eca0306966f10e7ec52b653c03fc3aeb3e428ddc5d73e037779d
Feb 26 08:45:45 crc kubenswrapper[4925]: W0226 08:45:45.950862 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdea74e1_d37c_4aa3_861e_6d4eab3a870a.slice/crio-6444295b08a6f2a508de68f5f590a04f7e8edaeff124df07f51f34495c72e89e WatchSource:0}: Error finding container 6444295b08a6f2a508de68f5f590a04f7e8edaeff124df07f51f34495c72e89e: Status 404 returned error can't find the container with id 6444295b08a6f2a508de68f5f590a04f7e8edaeff124df07f51f34495c72e89e
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.025974 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.026478 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.026491 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.026511 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.026538 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:46Z","lastTransitionTime":"2026-02-26T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.112757 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.112879 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.112908 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.112935 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:45:47.112908427 +0000 UTC m=+79.252482710 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.113014 4925 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.113051 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:47.11303965 +0000 UTC m=+79.252613933 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.113058 4925 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.113130 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:47.113113502 +0000 UTC m=+79.252687785 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.129287 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.129331 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.129342 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.129360 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.129371 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:46Z","lastTransitionTime":"2026-02-26T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.213627 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.213704 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.213732 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.213823 4925 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.213886 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.213908 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.213911 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs podName:c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc nodeName:}" failed. No retries permitted until 2026-02-26 08:45:47.213891858 +0000 UTC m=+79.353466141 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs") pod "network-metrics-daemon-96njb" (UID: "c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.213924 4925 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.213933 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.213960 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.213975 4925 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.213988 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:47.213969591 +0000 UTC m=+79.353543874 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 08:45:46 crc kubenswrapper[4925]: E0226 08:45:46.214066 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:47.214017902 +0000 UTC m=+79.353592395 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.230986 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.231013 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.231021 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.231034 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.231044 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:46Z","lastTransitionTime":"2026-02-26T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.337069 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.337142 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.337159 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.337189 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.337206 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:46Z","lastTransitionTime":"2026-02-26T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.447316 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.447375 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.447391 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.447408 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.447420 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:46Z","lastTransitionTime":"2026-02-26T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.510663 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.511385 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.512661 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.513398 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.514756 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.515292 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.515896 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.516885 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.517500 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.518452 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.518978 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.520012 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.520608 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.521114 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.522101 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.522698 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.523640 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.524060 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.524619 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.525700 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.526181 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.527304 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.527842 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.528912 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.529432 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.530120 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.531419 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.531961 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.532654 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.533141 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.533673 4925 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.533774 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.535205 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.535847 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.536308 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.537622 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.538286 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.538960 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.539678 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.540343 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.541862 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.542940 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.543906 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.544691 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.545212 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.545761 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.547514 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.548553 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.549827 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.550436 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.551468 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.552145 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.553120 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.554147 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.554629 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.554670 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.554683 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.554700 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.554711 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:46Z","lastTransitionTime":"2026-02-26T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.656703 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.656746 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.656760 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.656782 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.656798 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:46Z","lastTransitionTime":"2026-02-26T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.759625 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.759663 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.759672 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.759686 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.759697 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:46Z","lastTransitionTime":"2026-02-26T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.811489 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n8ts9" event={"ID":"700c9110-ceb1-4a4a-8fea-0fca5904ff5d","Type":"ContainerStarted","Data":"c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.811607 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n8ts9" event={"ID":"700c9110-ceb1-4a4a-8fea-0fca5904ff5d","Type":"ContainerStarted","Data":"ba5dad65dd9f1e776d8331d8f56646b4f313cf320de576de1480364e01800207"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.814388 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fmxxp" event={"ID":"03d05327-5621-49e8-8051-9dccd63f3a5b","Type":"ContainerStarted","Data":"5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.814445 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fmxxp" event={"ID":"03d05327-5621-49e8-8051-9dccd63f3a5b","Type":"ContainerStarted","Data":"1add7085d265eca0306966f10e7ec52b653c03fc3aeb3e428ddc5d73e037779d"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.816332 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ztgdx" event={"ID":"9d171f4c-5b3c-498e-acee-1725e2921586","Type":"ContainerStarted","Data":"9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.816410 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ztgdx" event={"ID":"9d171f4c-5b3c-498e-acee-1725e2921586","Type":"ContainerStarted","Data":"05b9624594fc1c3ff2e35650083a560b5ac3276dafca4077c71619e9c8857044"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.819179 4925 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" event={"ID":"52bdf0f6-9afd-42fe-9a30-7da82bc7f619","Type":"ContainerStarted","Data":"84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.819238 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" event={"ID":"52bdf0f6-9afd-42fe-9a30-7da82bc7f619","Type":"ContainerStarted","Data":"767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.819254 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" event={"ID":"52bdf0f6-9afd-42fe-9a30-7da82bc7f619","Type":"ContainerStarted","Data":"850500d73aa1e0fef0820f983ba77f86eb62000c2434341a6997aae70e228737"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.822614 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.822659 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.822678 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a5b239656c1fe9963d9df9b297a02c77dbd51cf9cca9ff32ea68f267d33c3172"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.824707 4925 
generic.go:334] "Generic (PLEG): container finished" podID="90c32fd9-f2da-4ba9-8cf9-703b6de14025" containerID="6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8" exitCode=0 Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.824791 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" event={"ID":"90c32fd9-f2da-4ba9-8cf9-703b6de14025","Type":"ContainerDied","Data":"6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.824850 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" event={"ID":"90c32fd9-f2da-4ba9-8cf9-703b6de14025","Type":"ContainerStarted","Data":"54d675994f8e73b5bf25a9917a7ff3497a99ffedb8fa541952e67cf461609b4d"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.836483 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:46Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.836672 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" 
event={"ID":"bdea74e1-d37c-4aa3-861e-6d4eab3a870a","Type":"ContainerStarted","Data":"37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.836733 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" event={"ID":"bdea74e1-d37c-4aa3-861e-6d4eab3a870a","Type":"ContainerStarted","Data":"8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.836759 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" event={"ID":"bdea74e1-d37c-4aa3-861e-6d4eab3a870a","Type":"ContainerStarted","Data":"6444295b08a6f2a508de68f5f590a04f7e8edaeff124df07f51f34495c72e89e"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.839920 4925 generic.go:334] "Generic (PLEG): container finished" podID="671c7a2c-75d0-4322-a581-3d1b26bda633" containerID="723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366" exitCode=0 Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.840066 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerDied","Data":"723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.840132 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerStarted","Data":"291061c5f69df2937e958e173a9034835633bc5c33791e8c7b81d93f6a39c59a"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.841734 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"cf7dc54bd6b900f9205cb9d2693b38411cf8406e243afbb52b87aec846773811"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.844388 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.844456 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1f24e3f8fd10415d8c807a6debf3e5637870b07df5b2126dbd484ab63b69bf27"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.859460 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:46Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.862720 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.862754 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.862762 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.862800 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.862811 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:46Z","lastTransitionTime":"2026-02-26T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.880594 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:46Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.896365 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:46Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.929195 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:46Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.942804 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd
4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:46Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.956253 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:46Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.965962 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.966002 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.966160 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.966200 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.966215 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:46Z","lastTransitionTime":"2026-02-26T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.968332 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:46Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.981475 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:46Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:46 crc kubenswrapper[4925]: I0226 08:45:46.998093 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:46Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.013310 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.026633 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.042020 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.053311 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc 
kubenswrapper[4925]: I0226 08:45:47.065806 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.074203 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.074247 4925 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.074256 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.074271 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.074281 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:47Z","lastTransitionTime":"2026-02-26T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.077957 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.103408 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.114619 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.123195 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.123381 4925 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:45:49.123349091 +0000 UTC m=+81.262923374 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.123440 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.123537 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.123587 4925 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.123678 4925 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:49.123653959 +0000 UTC m=+81.263228422 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.123691 4925 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.123733 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:49.123726341 +0000 UTC m=+81.263300624 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.132095 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.144129 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.157507 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.173453 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.175403 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.175435 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.175443 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.175457 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.175466 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:47Z","lastTransitionTime":"2026-02-26T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.190610 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.208418 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.224399 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.224486 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.224553 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.224385 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.224596 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.224626 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.224647 4925 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.224694 4925 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.224722 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:49.224698413 +0000 UTC m=+81.364272706 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.224775 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs podName:c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc nodeName:}" failed. No retries permitted until 2026-02-26 08:45:49.224754214 +0000 UTC m=+81.364328587 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs") pod "network-metrics-daemon-96njb" (UID: "c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.224698 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.224814 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.224831 4925 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.224900 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:49.224880408 +0000 UTC m=+81.364454751 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.236002 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/
var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.245092 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc 
kubenswrapper[4925]: I0226 08:45:47.262986 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.278048 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.278102 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.278114 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.278131 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.278143 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:47Z","lastTransitionTime":"2026-02-26T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.380121 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.380156 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.380164 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.380179 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.380188 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:47Z","lastTransitionTime":"2026-02-26T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.484039 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.484080 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.484092 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.484107 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.484118 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:47Z","lastTransitionTime":"2026-02-26T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.506891 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.506894 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.506908 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.507005 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.507118 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.507302 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.507390 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:45:47 crc kubenswrapper[4925]: E0226 08:45:47.507434 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.586554 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.586599 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.586614 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.586632 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.586643 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:47Z","lastTransitionTime":"2026-02-26T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.691647 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.691710 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.691732 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.691763 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.691789 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:47Z","lastTransitionTime":"2026-02-26T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.795382 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.795449 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.795471 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.795499 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.795556 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:47Z","lastTransitionTime":"2026-02-26T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.851219 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" event={"ID":"90c32fd9-f2da-4ba9-8cf9-703b6de14025","Type":"ContainerStarted","Data":"984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa"}
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.860810 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerStarted","Data":"dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f"}
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.860894 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerStarted","Data":"0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76"}
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.860944 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerStarted","Data":"170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526"}
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.860970 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerStarted","Data":"7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d"}
Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.860997 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerStarted","Data":"656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca"}
Feb 26 08:45:47 crc 
kubenswrapper[4925]: I0226 08:45:47.874745 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.898638 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.900241 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.900309 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.900332 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.900363 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.900389 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:47Z","lastTransitionTime":"2026-02-26T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.922149 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.940857 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.961680 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc kubenswrapper[4925]: I0226 08:45:47.974911 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:47 crc 
kubenswrapper[4925]: I0226 08:45:47.990413 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f
wkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.003597 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.003908 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.004024 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.004126 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.004204 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.006876 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.021646 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.035188 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.054084 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.070772 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.092246 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.106859 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.106885 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.106893 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.106905 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.106915 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.108970 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.209295 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.209326 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.209337 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.209353 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.209364 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.311452 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.311488 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.311497 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.311509 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.311540 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.413782 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.413820 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.413828 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.413840 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.413849 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.516095 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.516165 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.516181 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.516203 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.516218 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.521571 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.538177 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.554129 4925 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.571579 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.585580 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.596414 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc 
kubenswrapper[4925]: I0226 08:45:48.605368 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.619042 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.619094 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.619111 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.619136 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.619152 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.621590 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.632768 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636
aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.647705 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.658568 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 
08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.671661 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.671837 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.671931 4925 scope.go:117] "RemoveContainer" containerID="450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011" Feb 26 08:45:48 crc kubenswrapper[4925]: E0226 08:45:48.672314 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.686040 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.702381 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.718091 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.721924 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.721970 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: 
I0226 08:45:48.721983 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.722004 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.722027 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.824810 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.824876 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.824897 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.824927 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.824951 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.865945 4925 generic.go:334] "Generic (PLEG): container finished" podID="90c32fd9-f2da-4ba9-8cf9-703b6de14025" containerID="984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa" exitCode=0 Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.866004 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" event={"ID":"90c32fd9-f2da-4ba9-8cf9-703b6de14025","Type":"ContainerDied","Data":"984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.872699 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerStarted","Data":"77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.873236 4925 scope.go:117] "RemoveContainer" containerID="450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011" Feb 26 08:45:48 crc kubenswrapper[4925]: E0226 08:45:48.873411 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.884517 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.905738 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.924328 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc 
kubenswrapper[4925]: I0226 08:45:48.927513 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.927773 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.927954 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.928093 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.928215 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.929667 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.929734 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.929755 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.929779 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.929796 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.945238 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: E0226 08:45:48.949636 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.955051 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.955113 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.955127 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.955149 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.955167 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.964322 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: E0226 08:45:48.969800 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.974262 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.974331 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.974351 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.974377 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.974396 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.978727 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: E0226 08:45:48.986845 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.990047 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.990080 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.990088 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.990102 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.990111 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:48Z","lastTransitionTime":"2026-02-26T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:48 crc kubenswrapper[4925]: I0226 08:45:48.993851 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:48Z 
is after 2025-08-24T17:21:41Z" Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.003488 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z"
Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.007624 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.007686 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.007712 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.007742 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.007764 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:49Z","lastTransitionTime":"2026-02-26T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.011611 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.020925 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.023880 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z"
Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.024057 4925 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.031083 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.031133 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.031144 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.031163 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.031177 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:49Z","lastTransitionTime":"2026-02-26T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.034389 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.043377 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.054396 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.066632 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.077812 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.091505 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.134896 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.134922 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.134929 4925 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.134942 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.134950 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:49Z","lastTransitionTime":"2026-02-26T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.148573 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.148717 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.148735 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:45:53.148710449 +0000 UTC m=+85.288284742 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.148776 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.148806 4925 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.148877 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:53.148861623 +0000 UTC m=+85.288435916 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.148892 4925 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.148940 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:53.148928925 +0000 UTC m=+85.288503288 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.237561 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.237596 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.237605 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.237621 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 
26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.237630 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:49Z","lastTransitionTime":"2026-02-26T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.250429 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.250550 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.250589 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.250754 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:45:49 
crc kubenswrapper[4925]: E0226 08:45:49.250788 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.250805 4925 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.250867 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.250902 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.250919 4925 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.250878 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:53.250855362 +0000 UTC m=+85.390429665 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.250979 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 08:45:53.250964955 +0000 UTC m=+85.390539258 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.251165 4925 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.251386 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs podName:c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc nodeName:}" failed. No retries permitted until 2026-02-26 08:45:53.251365375 +0000 UTC m=+85.390939658 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs") pod "network-metrics-daemon-96njb" (UID: "c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.340620 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.340968 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.340979 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.340996 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.341009 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:49Z","lastTransitionTime":"2026-02-26T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.443663 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.443696 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.443708 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.443723 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.443734 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:49Z","lastTransitionTime":"2026-02-26T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.507049 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.507065 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.507075 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.507236 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.507387 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.507814 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.507986 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:45:49 crc kubenswrapper[4925]: E0226 08:45:49.508124 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.546450 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.546489 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.546501 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.546515 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.546549 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:49Z","lastTransitionTime":"2026-02-26T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.648927 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.648965 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.648974 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.648986 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.648995 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:49Z","lastTransitionTime":"2026-02-26T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.752431 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.752506 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.752562 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.752595 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.752620 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:49Z","lastTransitionTime":"2026-02-26T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.855465 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.855600 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.855628 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.855660 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.855686 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:49Z","lastTransitionTime":"2026-02-26T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.879238 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0"} Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.882879 4925 generic.go:334] "Generic (PLEG): container finished" podID="90c32fd9-f2da-4ba9-8cf9-703b6de14025" containerID="963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae" exitCode=0 Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.882943 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" event={"ID":"90c32fd9-f2da-4ba9-8cf9-703b6de14025","Type":"ContainerDied","Data":"963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae"} Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.897412 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.921428 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.944824 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.957937 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.957977 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.957991 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.958016 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.958033 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:49Z","lastTransitionTime":"2026-02-26T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.968948 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:49 crc kubenswrapper[4925]: I0226 08:45:49.986735 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc 
kubenswrapper[4925]: I0226 08:45:50.004321 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.021231 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.038755 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.057038 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.061298 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:50 crc 
kubenswrapper[4925]: I0226 08:45:50.061350 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.061365 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.061385 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.061399 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:50Z","lastTransitionTime":"2026-02-26T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.078160 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.096268 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.113512 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.126754 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.142142 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.158023 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.164587 4925 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.164634 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.164647 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.164664 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.164676 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:50Z","lastTransitionTime":"2026-02-26T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.182253 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.195595 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.209770 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.223022 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.236175 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.248451 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.263654 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.267387 4925 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.267422 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.267439 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.267462 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.267480 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:50Z","lastTransitionTime":"2026-02-26T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.279093 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.294951 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.311218 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.323649 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc 
kubenswrapper[4925]: I0226 08:45:50.341628 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.357033 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.370102 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.370136 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.370145 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.370159 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.370168 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:50Z","lastTransitionTime":"2026-02-26T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.372669 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.395191 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.473440 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.473518 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.473575 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.473608 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.473631 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:50Z","lastTransitionTime":"2026-02-26T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.575959 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.576001 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.576013 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.576028 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.576040 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:50Z","lastTransitionTime":"2026-02-26T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.679208 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.679260 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.679276 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.679297 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.679313 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:50Z","lastTransitionTime":"2026-02-26T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.782615 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.782690 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.782709 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.782734 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.782755 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:50Z","lastTransitionTime":"2026-02-26T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.886260 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.886324 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.886344 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.886367 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.886384 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:50Z","lastTransitionTime":"2026-02-26T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.889469 4925 generic.go:334] "Generic (PLEG): container finished" podID="90c32fd9-f2da-4ba9-8cf9-703b6de14025" containerID="d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9" exitCode=0 Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.889589 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" event={"ID":"90c32fd9-f2da-4ba9-8cf9-703b6de14025","Type":"ContainerDied","Data":"d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9"} Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.895453 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerStarted","Data":"43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f"} Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.920977 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.935363 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.955268 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.971304 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.984140 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.988494 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.988563 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.988580 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.988605 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:50 crc kubenswrapper[4925]: I0226 08:45:50.988621 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:50Z","lastTransitionTime":"2026-02-26T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.008248 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.027869 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc 
kubenswrapper[4925]: I0226 08:45:51.062845 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.073126 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.082793 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.091154 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.091195 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.091207 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.091224 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.091236 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:51Z","lastTransitionTime":"2026-02-26T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.092859 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z 
is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.109991 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.122951 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.134752 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.146797 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.194335 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.194381 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.194395 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.194412 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.194433 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:51Z","lastTransitionTime":"2026-02-26T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.297519 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.297591 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.297607 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.297630 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.297646 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:51Z","lastTransitionTime":"2026-02-26T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.399798 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.399838 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.399848 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.399863 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.399874 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:51Z","lastTransitionTime":"2026-02-26T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.502476 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.502548 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.502561 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.502579 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.502593 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:51Z","lastTransitionTime":"2026-02-26T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.506009 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.506052 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:45:51 crc kubenswrapper[4925]: E0226 08:45:51.506113 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.506114 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.506148 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:45:51 crc kubenswrapper[4925]: E0226 08:45:51.506241 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:45:51 crc kubenswrapper[4925]: E0226 08:45:51.506429 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:45:51 crc kubenswrapper[4925]: E0226 08:45:51.506497 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.605974 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.606050 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.606074 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.606104 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.606127 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:51Z","lastTransitionTime":"2026-02-26T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.709465 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.709562 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.709580 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.709611 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.709629 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:51Z","lastTransitionTime":"2026-02-26T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.813039 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.813108 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.813127 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.813180 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.813198 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:51Z","lastTransitionTime":"2026-02-26T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.905326 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" event={"ID":"90c32fd9-f2da-4ba9-8cf9-703b6de14025","Type":"ContainerStarted","Data":"fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334"}
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.917416 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.917634 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.917664 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.917695 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.917718 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:51Z","lastTransitionTime":"2026-02-26T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.921825 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.939195 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.955313 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.971026 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:51 crc kubenswrapper[4925]: I0226 08:45:51.987424 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:51Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.005552 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc 
kubenswrapper[4925]: I0226 08:45:52.020368 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.020416 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.020429 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.020448 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.020461 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:52Z","lastTransitionTime":"2026-02-26T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.028648 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.043362 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.056194 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.070327 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/e
tc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.090368 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.106277 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.123608 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.123756 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.123769 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 
08:45:52.123785 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.123803 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:52Z","lastTransitionTime":"2026-02-26T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.125216 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.138164 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.149471 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.226247 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.226658 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.226666 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.226680 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.226690 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:52Z","lastTransitionTime":"2026-02-26T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.330304 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.330394 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.330417 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.330443 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.330461 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:52Z","lastTransitionTime":"2026-02-26T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.434135 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.434213 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.434236 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.434265 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.434286 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:52Z","lastTransitionTime":"2026-02-26T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.537386 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.537446 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.537464 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.537489 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.537511 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:52Z","lastTransitionTime":"2026-02-26T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.640753 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.640810 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.640830 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.640861 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.640880 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:52Z","lastTransitionTime":"2026-02-26T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.743848 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.743918 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.743940 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.743972 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.743994 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:52Z","lastTransitionTime":"2026-02-26T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.779015 4925 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.846941 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.846996 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.847012 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.847036 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.847054 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:52Z","lastTransitionTime":"2026-02-26T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.913493 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerStarted","Data":"f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72"} Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.914610 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.914815 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.914885 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.920520 4925 generic.go:334] "Generic (PLEG): container finished" podID="90c32fd9-f2da-4ba9-8cf9-703b6de14025" containerID="fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334" exitCode=0 Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.920582 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" event={"ID":"90c32fd9-f2da-4ba9-8cf9-703b6de14025","Type":"ContainerDied","Data":"fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334"} Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.930686 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.949089 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.949769 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.949816 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.949831 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.949852 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.949870 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:52Z","lastTransitionTime":"2026-02-26T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.953442 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.957680 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.965516 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9
cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.979072 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863
ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:52 crc kubenswrapper[4925]: I0226 08:45:52.992180 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636
aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:52Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.012409 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.032270 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.046693 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.052234 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.052271 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.052284 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.052301 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.052313 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:53Z","lastTransitionTime":"2026-02-26T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.058295 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.070776 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.086345 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237
c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.100720 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.491119 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.491190 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.491212 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.491234 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.491257 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.491283 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491361 4925 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491417 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:46:01.491391923 +0000 UTC m=+93.630966196 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491610 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491636 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491649 4925 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491668 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:46:01.491587218 +0000 UTC m=+93.631161501 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491677 4925 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491698 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 08:46:01.491690831 +0000 UTC m=+93.631265114 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491726 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491738 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491745 4925 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491753 4925 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491775 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:46:01.491745322 +0000 UTC m=+93.631319625 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491863 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 08:46:01.491842095 +0000 UTC m=+93.631416378 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.491883 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs podName:c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc nodeName:}" failed. No retries permitted until 2026-02-26 08:46:01.491872826 +0000 UTC m=+93.631447109 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs") pod "network-metrics-daemon-96njb" (UID: "c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.493829 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.493872 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.493880 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.493894 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.493907 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:53Z","lastTransitionTime":"2026-02-26T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.502213 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.506551 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.506658 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.506722 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.506771 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.506828 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.506885 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.506935 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:45:53 crc kubenswrapper[4925]: E0226 08:45:53.506981 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.512126 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf6628
39e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.524948 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc 
kubenswrapper[4925]: I0226 08:45:53.537207 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.551955 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.564585 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc 
kubenswrapper[4925]: I0226 08:45:53.582572 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.596501 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.596550 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.596561 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.596579 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.596594 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:53Z","lastTransitionTime":"2026-02-26T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.604101 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.616287 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.630162 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.651834 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.663841 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.676988 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.689555 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.699215 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.699260 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.699276 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.699302 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.699317 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:53Z","lastTransitionTime":"2026-02-26T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.704673 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.716732 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636
aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.731700 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.747915 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.802134 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.802200 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.802213 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.802269 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.802288 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:53Z","lastTransitionTime":"2026-02-26T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.905970 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.906038 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.906051 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.906075 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.906095 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:53Z","lastTransitionTime":"2026-02-26T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.935982 4925 generic.go:334] "Generic (PLEG): container finished" podID="90c32fd9-f2da-4ba9-8cf9-703b6de14025" containerID="223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713" exitCode=0 Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.936067 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" event={"ID":"90c32fd9-f2da-4ba9-8cf9-703b6de14025","Type":"ContainerDied","Data":"223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713"} Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.951955 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.968700 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:53 crc kubenswrapper[4925]: I0226 08:45:53.987684 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.000493 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:53Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc 
kubenswrapper[4925]: I0226 08:45:54.012942 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.012992 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.013010 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.013033 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.013050 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:54Z","lastTransitionTime":"2026-02-26T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.018430 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.038476 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.052259 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.068007 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.081475 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.096327 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.119849 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.120174 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.120218 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.120264 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.120285 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.120298 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:54Z","lastTransitionTime":"2026-02-26T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.133299 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.148956 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.173117 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.190229 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.223629 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.224235 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:54 crc 
kubenswrapper[4925]: I0226 08:45:54.224256 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.224274 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.224285 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:54Z","lastTransitionTime":"2026-02-26T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.326692 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.326735 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.326746 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.326765 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.326777 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:54Z","lastTransitionTime":"2026-02-26T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.429547 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.429621 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.429636 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.429658 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.429674 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:54Z","lastTransitionTime":"2026-02-26T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.531820 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.532028 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.532088 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.532146 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.532201 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:54Z","lastTransitionTime":"2026-02-26T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.634521 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.634958 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.635059 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.635143 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.635241 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:54Z","lastTransitionTime":"2026-02-26T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.739994 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.740055 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.740068 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.740091 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.740104 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:54Z","lastTransitionTime":"2026-02-26T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.842077 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.842115 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.842125 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.842140 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.842151 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:54Z","lastTransitionTime":"2026-02-26T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.942476 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" event={"ID":"90c32fd9-f2da-4ba9-8cf9-703b6de14025","Type":"ContainerStarted","Data":"e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7"} Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.943860 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.943884 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.943891 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.943902 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.943911 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:54Z","lastTransitionTime":"2026-02-26T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.953676 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.969080 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-c
ni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\
"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.983991 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:54 crc kubenswrapper[4925]: I0226 08:45:54.998829 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:54Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.012240 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.025104 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.037397 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc 
kubenswrapper[4925]: I0226 08:45:55.045650 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.045689 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.045700 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.045716 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.045728 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:55Z","lastTransitionTime":"2026-02-26T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.050696 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.061744 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.071216 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.081515 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.093039 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.106976 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.127494 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.139254 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.147711 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.147750 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.147761 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.147777 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 
08:45:55.147789 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:55Z","lastTransitionTime":"2026-02-26T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.253385 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.253431 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.253442 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.253461 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.253474 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:55Z","lastTransitionTime":"2026-02-26T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.356353 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.356397 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.356408 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.356425 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.356439 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:55Z","lastTransitionTime":"2026-02-26T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.458897 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.458957 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.458970 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.458992 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.459007 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:55Z","lastTransitionTime":"2026-02-26T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.506284 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.506309 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.506339 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.506284 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:45:55 crc kubenswrapper[4925]: E0226 08:45:55.506414 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:45:55 crc kubenswrapper[4925]: E0226 08:45:55.506482 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:45:55 crc kubenswrapper[4925]: E0226 08:45:55.506592 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:45:55 crc kubenswrapper[4925]: E0226 08:45:55.506689 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.560845 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.560896 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.560908 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.560925 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.560938 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:55Z","lastTransitionTime":"2026-02-26T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.663948 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.664005 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.664022 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.664047 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.664063 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:55Z","lastTransitionTime":"2026-02-26T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.766404 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.766431 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.766438 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.766450 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.766458 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:55Z","lastTransitionTime":"2026-02-26T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.868651 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.868690 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.868701 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.868715 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.868726 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:55Z","lastTransitionTime":"2026-02-26T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.947265 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/0.log" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.949660 4925 generic.go:334] "Generic (PLEG): container finished" podID="671c7a2c-75d0-4322-a581-3d1b26bda633" containerID="f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72" exitCode=1 Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.949694 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerDied","Data":"f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72"} Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.950336 4925 scope.go:117] "RemoveContainer" containerID="f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.965071 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.974858 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.974886 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.974895 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 
08:45:55.974908 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.974927 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:55Z","lastTransitionTime":"2026-02-26T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.979478 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a1
5eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:55 crc kubenswrapper[4925]: I0226 08:45:55.998839 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237
c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.020417 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.034196 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.046360 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.056554 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:56 crc 
kubenswrapper[4925]: I0226 08:45:56.066890 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f
wkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.079000 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.079056 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.079068 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.079088 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.079100 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:56Z","lastTransitionTime":"2026-02-26T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.082703 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.100073 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.111349 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.122359 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.137107 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.159437 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:45:55Z\\\",\\\"message\\\":\\\"ding *v1.Pod event handler 6 for removal\\\\nI0226 08:45:55.317389 6516 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:45:55.317431 6516 handler.go:190] Sending *v1.EgressIP event handler 8 
for removal\\\\nI0226 08:45:55.317450 6516 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 08:45:55.317449 6516 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:45:55.317472 6516 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:45:55.317458 6516 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:45:55.317478 6516 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0226 08:45:55.317456 6516 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0226 08:45:55.317489 6516 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 08:45:55.317495 6516 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:45:55.317498 6516 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 08:45:55.317506 6516 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:45:55.317550 6516 factory.go:656] Stopping watch factory\\\\nI0226 08:45:55.317568 6516 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.171895 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.182367 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.182403 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.182414 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.182434 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.182444 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:56Z","lastTransitionTime":"2026-02-26T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.286342 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.286409 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.286427 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.286453 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.286471 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:56Z","lastTransitionTime":"2026-02-26T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.389859 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.389921 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.389931 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.390001 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.390015 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:56Z","lastTransitionTime":"2026-02-26T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.492937 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.493003 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.493012 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.493030 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.493041 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:56Z","lastTransitionTime":"2026-02-26T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.595086 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.595135 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.595146 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.595163 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.595175 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:56Z","lastTransitionTime":"2026-02-26T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.696783 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.696825 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.696834 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.696852 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.696862 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:56Z","lastTransitionTime":"2026-02-26T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.799329 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.799359 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.799367 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.799381 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.799393 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:56Z","lastTransitionTime":"2026-02-26T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.902018 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.902086 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.902098 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.902118 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.902131 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:56Z","lastTransitionTime":"2026-02-26T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.954417 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/0.log" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.957550 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerStarted","Data":"829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581"} Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.958041 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.980626 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc
504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:56 crc kubenswrapper[4925]: I0226 08:45:56.996385 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.004971 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.005020 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 
26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.005032 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.005051 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.005065 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:57Z","lastTransitionTime":"2026-02-26T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.016995 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.034500 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.062818 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.087563 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc 
kubenswrapper[4925]: I0226 08:45:57.107492 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.107555 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.107568 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.107585 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.107596 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:57Z","lastTransitionTime":"2026-02-26T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.118990 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.138784 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.150044 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.160173 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.173676 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.187224 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.207420 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:45:55Z\\\",\\\"message\\\":\\\"ding *v1.Pod event handler 6 for removal\\\\nI0226 08:45:55.317389 6516 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:45:55.317431 6516 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:45:55.317450 6516 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 
08:45:55.317449 6516 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:45:55.317472 6516 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:45:55.317458 6516 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:45:55.317478 6516 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0226 08:45:55.317456 6516 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0226 08:45:55.317489 6516 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 08:45:55.317495 6516 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:45:55.317498 6516 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 08:45:55.317506 6516 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:45:55.317550 6516 factory.go:656] Stopping watch factory\\\\nI0226 08:45:55.317568 6516 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.210075 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.210112 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.210125 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.210143 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.210155 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:57Z","lastTransitionTime":"2026-02-26T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.218140 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.230237 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.312273 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.312314 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.312323 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.312339 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.312351 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:57Z","lastTransitionTime":"2026-02-26T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.414755 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.414793 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.414802 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.414817 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.414827 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:57Z","lastTransitionTime":"2026-02-26T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.452380 4925 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.506243 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.506297 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.506341 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:45:57 crc kubenswrapper[4925]: E0226 08:45:57.506354 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.506313 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:57 crc kubenswrapper[4925]: E0226 08:45:57.506446 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:45:57 crc kubenswrapper[4925]: E0226 08:45:57.506543 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:45:57 crc kubenswrapper[4925]: E0226 08:45:57.506637 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.516516 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.516578 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.516596 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.516619 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.516635 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:57Z","lastTransitionTime":"2026-02-26T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.619410 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.619452 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.619469 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.619494 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.619505 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:57Z","lastTransitionTime":"2026-02-26T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.732377 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.732459 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.732504 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.732578 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.732604 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:57Z","lastTransitionTime":"2026-02-26T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.846012 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.846063 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.846074 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.846088 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.846098 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:57Z","lastTransitionTime":"2026-02-26T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.949207 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.949262 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.949272 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.949288 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.949297 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:57Z","lastTransitionTime":"2026-02-26T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.962511 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/1.log" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.963048 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/0.log" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.966314 4925 generic.go:334] "Generic (PLEG): container finished" podID="671c7a2c-75d0-4322-a581-3d1b26bda633" containerID="829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581" exitCode=1 Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.966349 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerDied","Data":"829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581"} Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.966395 4925 scope.go:117] "RemoveContainer" containerID="f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.967468 4925 scope.go:117] "RemoveContainer" containerID="829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581" Feb 26 08:45:57 crc kubenswrapper[4925]: E0226 08:45:57.967791 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" podUID="671c7a2c-75d0-4322-a581-3d1b26bda633" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.983214 4925 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:57 crc kubenswrapper[4925]: I0226 08:45:57.994158 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:57Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.007099 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.020606 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.032522 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.044781 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.051573 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.051617 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.051629 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:58 crc 
kubenswrapper[4925]: I0226 08:45:58.051646 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.051658 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:58Z","lastTransitionTime":"2026-02-26T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.055812 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 
08:45:58.069287 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc 
kubenswrapper[4925]: I0226 08:45:58.079866 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f
wkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.095265 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.104659 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.113771 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.123833 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.136984 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.153764 4925 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.153793 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.153802 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.153817 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.153830 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:58Z","lastTransitionTime":"2026-02-26T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.155998 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:45:55Z\\\",\\\"message\\\":\\\"ding *v1.Pod event handler 6 for removal\\\\nI0226 08:45:55.317389 6516 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:45:55.317431 6516 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:45:55.317450 6516 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 
08:45:55.317449 6516 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:45:55.317472 6516 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:45:55.317458 6516 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:45:55.317478 6516 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0226 08:45:55.317456 6516 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0226 08:45:55.317489 6516 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 08:45:55.317495 6516 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:45:55.317498 6516 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 08:45:55.317506 6516 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:45:55.317550 6516 factory.go:656] Stopping watch factory\\\\nI0226 08:45:55.317568 6516 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:45:56Z\\\",\\\"message\\\":\\\"id: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z]\\\\nI0226 08:45:56.937231 6906 services_controller.go:434] Service openshift-marketplace/marketplace-operator-metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{marketplace-operator-metrics openshift-marketplace ee1b3a20-644f-4c69-a038-1d53fcace871 4537 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[name:marketplace-operator] map[capability.openshift.io/name:marketplace include.release.openshift.io/hypershift:true 
include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00747db57 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},Clu\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-
cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.257263 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.257428 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.257444 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.257496 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.257513 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:58Z","lastTransitionTime":"2026-02-26T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.360000 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.360032 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.360041 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.360055 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.360067 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:58Z","lastTransitionTime":"2026-02-26T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.462562 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.462606 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.462617 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.462634 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.462647 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:58Z","lastTransitionTime":"2026-02-26T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.518545 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.530510 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.541708 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.555002 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.564599 4925 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.564642 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.564656 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.564672 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.564684 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:58Z","lastTransitionTime":"2026-02-26T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.566388 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z 
is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.583915 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b5fbd478d6eacf67094af8fcf05a96aab9d5200a313d948397965b80658b72\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:45:55Z\\\",\\\"message\\\":\\\"ding *v1.Pod event handler 6 for removal\\\\nI0226 08:45:55.317389 6516 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:45:55.317431 6516 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:45:55.317450 6516 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 
08:45:55.317449 6516 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:45:55.317472 6516 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:45:55.317458 6516 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:45:55.317478 6516 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0226 08:45:55.317456 6516 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0226 08:45:55.317489 6516 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 08:45:55.317495 6516 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:45:55.317498 6516 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 08:45:55.317506 6516 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:45:55.317550 6516 factory.go:656] Stopping watch factory\\\\nI0226 08:45:55.317568 6516 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:45:56Z\\\",\\\"message\\\":\\\"id: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z]\\\\nI0226 08:45:56.937231 6906 services_controller.go:434] Service openshift-marketplace/marketplace-operator-metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{marketplace-operator-metrics openshift-marketplace ee1b3a20-644f-4c69-a038-1d53fcace871 4537 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[name:marketplace-operator] map[capability.openshift.io/name:marketplace include.release.openshift.io/hypershift:true 
include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00747db57 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},Clu\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-
cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.593379 4925 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.602519 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.616541 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.630088 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.646605 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.659864 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.666603 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.666629 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.666640 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.666655 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.666665 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:58Z","lastTransitionTime":"2026-02-26T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.670130 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.679502 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc 
kubenswrapper[4925]: I0226 08:45:58.694595 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.770003 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.770039 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.770051 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.770068 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.770080 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:58Z","lastTransitionTime":"2026-02-26T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.872751 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.872817 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.872834 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.872871 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.872887 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:58Z","lastTransitionTime":"2026-02-26T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.972925 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/1.log" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.974898 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.974932 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.974945 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.974966 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.974981 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:58Z","lastTransitionTime":"2026-02-26T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.976856 4925 scope.go:117] "RemoveContainer" containerID="829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581" Feb 26 08:45:58 crc kubenswrapper[4925]: E0226 08:45:58.977035 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" podUID="671c7a2c-75d0-4322-a581-3d1b26bda633" Feb 26 08:45:58 crc kubenswrapper[4925]: I0226 08:45:58.992602 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-
operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.005432 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.020654 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.032957 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc 
kubenswrapper[4925]: I0226 08:45:59.049501 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.062549 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.072329 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.072464 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.072575 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.072622 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.072711 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.075175 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: E0226 08:45:59.088836 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.090574 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.093407 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.093450 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.093464 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.093481 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.093492 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:59 crc kubenswrapper[4925]: E0226 08:45:59.111045 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.112079 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:45:56Z\\\",\\\"message\\\":\\\"id: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z]\\\\nI0226 08:45:56.937231 6906 services_controller.go:434] Service openshift-marketplace/marketplace-operator-metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{marketplace-operator-metrics openshift-marketplace ee1b3a20-644f-4c69-a038-1d53fcace871 
4537 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[name:marketplace-operator] map[capability.openshift.io/name:marketplace include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00747db57 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},Clu\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438ce
e1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.114889 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.114953 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.114973 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.115000 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.115023 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.120897 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: E0226 08:45:59.128840 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.131694 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.133173 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.133219 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.133231 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.133255 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.133270 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.143036 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: E0226 08:45:59.145830 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.152420 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.152579 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.152669 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.152759 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.152832 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.155618 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: E0226 08:45:59.167225 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:59Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: E0226 08:45:59.167480 4925 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.168131 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.169243 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.169332 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.169416 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 
08:45:59.169504 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.169593 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.187087 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a1
5eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.272552 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.272605 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.272622 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.272645 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.272661 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.375785 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.376089 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.376217 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.376512 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.376738 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.479482 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.479551 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.479562 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.479613 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.479630 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.506713 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.506768 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.506791 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.506799 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:45:59 crc kubenswrapper[4925]: E0226 08:45:59.507284 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:45:59 crc kubenswrapper[4925]: E0226 08:45:59.507138 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:45:59 crc kubenswrapper[4925]: E0226 08:45:59.507349 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:45:59 crc kubenswrapper[4925]: E0226 08:45:59.507007 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.583579 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.584007 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.584189 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.584370 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.584561 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.686992 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.687048 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.687060 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.687076 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.687086 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.788840 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.788881 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.788891 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.788938 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.788953 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.892480 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.892574 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.892587 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.892611 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.892625 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.995716 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.995756 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.995768 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.995783 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:45:59 crc kubenswrapper[4925]: I0226 08:45:59.995794 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:45:59Z","lastTransitionTime":"2026-02-26T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.098575 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.098644 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.098682 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.098716 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.098738 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:00Z","lastTransitionTime":"2026-02-26T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.202079 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.202116 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.202128 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.202145 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.202159 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:00Z","lastTransitionTime":"2026-02-26T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.305565 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.305612 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.305623 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.305641 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.305652 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:00Z","lastTransitionTime":"2026-02-26T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.409755 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.409793 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.409802 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.409815 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.409824 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:00Z","lastTransitionTime":"2026-02-26T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.512255 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.512289 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.512306 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.512322 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.512333 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:00Z","lastTransitionTime":"2026-02-26T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.614429 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.614489 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.614505 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.614581 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.614604 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:00Z","lastTransitionTime":"2026-02-26T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.717723 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.717759 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.717785 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.717801 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.717813 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:00Z","lastTransitionTime":"2026-02-26T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.819885 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.819917 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.819926 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.819940 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.819954 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:00Z","lastTransitionTime":"2026-02-26T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.922809 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.922869 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.922886 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.922910 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:00 crc kubenswrapper[4925]: I0226 08:46:00.922927 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:00Z","lastTransitionTime":"2026-02-26T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.025941 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.026582 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.026645 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.026703 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.026762 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:01Z","lastTransitionTime":"2026-02-26T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.129035 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.129250 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.129308 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.129404 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.129461 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:01Z","lastTransitionTime":"2026-02-26T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.231563 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.231612 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.231659 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.231685 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.231702 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:01Z","lastTransitionTime":"2026-02-26T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.334561 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.334620 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.334642 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.334673 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.334695 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:01Z","lastTransitionTime":"2026-02-26T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.437597 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.437660 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.437679 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.437702 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.437718 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:01Z","lastTransitionTime":"2026-02-26T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.506965 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.507109 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.507320 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.507316 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.507350 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.507460 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.507574 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.507743 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.540631 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.540670 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.540681 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.540697 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.540708 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:01Z","lastTransitionTime":"2026-02-26T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.585932 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586159 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:46:17.586128571 +0000 UTC m=+109.725702854 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.586228 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.586277 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.586319 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.586359 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.586413 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586424 4925 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586506 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 
nodeName:}" failed. No retries permitted until 2026-02-26 08:46:17.58648548 +0000 UTC m=+109.726059763 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586544 4925 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586563 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586593 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586601 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs podName:c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc nodeName:}" failed. No retries permitted until 2026-02-26 08:46:17.586584133 +0000 UTC m=+109.726158646 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs") pod "network-metrics-daemon-96njb" (UID: "c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586589 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586616 4925 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586620 4925 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586673 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 08:46:17.586654965 +0000 UTC m=+109.726229288 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586713 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:46:17.586689626 +0000 UTC m=+109.726263909 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586629 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586739 4925 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:46:01 crc kubenswrapper[4925]: E0226 08:46:01.586773 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" 
failed. No retries permitted until 2026-02-26 08:46:17.586766128 +0000 UTC m=+109.726340411 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.643659 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.643722 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.643748 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.643779 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.643802 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:01Z","lastTransitionTime":"2026-02-26T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.746011 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.746070 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.746087 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.746111 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.746130 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:01Z","lastTransitionTime":"2026-02-26T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.848711 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.848769 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.848798 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.848825 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.848844 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:01Z","lastTransitionTime":"2026-02-26T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.952314 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.952359 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.952369 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.952384 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:01 crc kubenswrapper[4925]: I0226 08:46:01.952393 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:01Z","lastTransitionTime":"2026-02-26T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.055569 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.055611 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.055620 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.055635 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.055646 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:02Z","lastTransitionTime":"2026-02-26T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.158682 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.158744 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.158762 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.158786 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.158803 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:02Z","lastTransitionTime":"2026-02-26T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.262073 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.262117 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.262129 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.262144 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.262155 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:02Z","lastTransitionTime":"2026-02-26T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.365979 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.366067 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.366092 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.366117 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.366134 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:02Z","lastTransitionTime":"2026-02-26T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.469038 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.469103 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.469127 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.469159 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.469183 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:02Z","lastTransitionTime":"2026-02-26T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.572464 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.572517 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.572551 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.572573 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.572585 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:02Z","lastTransitionTime":"2026-02-26T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.675641 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.675698 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.675710 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.675733 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.675747 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:02Z","lastTransitionTime":"2026-02-26T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.778558 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.778612 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.778623 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.778644 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.778655 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:02Z","lastTransitionTime":"2026-02-26T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.881671 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.881742 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.881756 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.881779 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.881794 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:02Z","lastTransitionTime":"2026-02-26T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.984748 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.984830 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.984845 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.984868 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:02 crc kubenswrapper[4925]: I0226 08:46:02.984884 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:02Z","lastTransitionTime":"2026-02-26T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.087453 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.087504 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.087514 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.087552 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.087565 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:03Z","lastTransitionTime":"2026-02-26T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.189584 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.189623 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.189632 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.189648 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.189658 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:03Z","lastTransitionTime":"2026-02-26T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.292639 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.292701 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.292713 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.292735 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.292751 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:03Z","lastTransitionTime":"2026-02-26T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.395492 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.395558 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.395568 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.395589 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.395602 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:03Z","lastTransitionTime":"2026-02-26T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.544458 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.544609 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.544665 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.544692 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.544719 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.544791 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.544816 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:03 crc kubenswrapper[4925]: E0226 08:46:03.544813 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.544829 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:03Z","lastTransitionTime":"2026-02-26T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:03 crc kubenswrapper[4925]: E0226 08:46:03.544934 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.544843 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:03 crc kubenswrapper[4925]: E0226 08:46:03.545046 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:03 crc kubenswrapper[4925]: E0226 08:46:03.545155 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.647351 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.647398 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.647409 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.647426 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.647438 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:03Z","lastTransitionTime":"2026-02-26T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.750810 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.750880 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.750892 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.750914 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.750929 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:03Z","lastTransitionTime":"2026-02-26T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.853732 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.853833 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.853856 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.853884 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.853905 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:03Z","lastTransitionTime":"2026-02-26T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.956798 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.957272 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.957347 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.957458 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:03 crc kubenswrapper[4925]: I0226 08:46:03.957611 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:03Z","lastTransitionTime":"2026-02-26T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.060393 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.060458 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.060474 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.060498 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.060515 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:04Z","lastTransitionTime":"2026-02-26T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.164098 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.164182 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.164204 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.164230 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.164250 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:04Z","lastTransitionTime":"2026-02-26T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.266456 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.266502 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.266511 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.266547 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.266558 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:04Z","lastTransitionTime":"2026-02-26T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.369146 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.369187 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.369197 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.369217 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.369229 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:04Z","lastTransitionTime":"2026-02-26T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.472458 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.473100 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.473267 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.473382 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.473470 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:04Z","lastTransitionTime":"2026-02-26T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.507875 4925 scope.go:117] "RemoveContainer" containerID="450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011" Feb 26 08:46:04 crc kubenswrapper[4925]: E0226 08:46:04.508702 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.576933 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.576993 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.577007 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.577032 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.577051 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:04Z","lastTransitionTime":"2026-02-26T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.680968 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.681072 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.681109 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.681136 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.681149 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:04Z","lastTransitionTime":"2026-02-26T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.785060 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.785137 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.785155 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.785181 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.785209 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:04Z","lastTransitionTime":"2026-02-26T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.888675 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.888719 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.888729 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.888748 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.888759 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:04Z","lastTransitionTime":"2026-02-26T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.993091 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.993138 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.993148 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.993170 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:04 crc kubenswrapper[4925]: I0226 08:46:04.993182 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:04Z","lastTransitionTime":"2026-02-26T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.096690 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.096824 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.096850 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.096882 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.096906 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:05Z","lastTransitionTime":"2026-02-26T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.200165 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.200859 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.200967 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.201056 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.201140 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:05Z","lastTransitionTime":"2026-02-26T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.304110 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.304153 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.304166 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.304186 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.304199 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:05Z","lastTransitionTime":"2026-02-26T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.407000 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.407055 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.407068 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.407086 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.407101 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:05Z","lastTransitionTime":"2026-02-26T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.506162 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.506162 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.506189 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.506294 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:05 crc kubenswrapper[4925]: E0226 08:46:05.506424 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:05 crc kubenswrapper[4925]: E0226 08:46:05.506631 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:05 crc kubenswrapper[4925]: E0226 08:46:05.506780 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:05 crc kubenswrapper[4925]: E0226 08:46:05.506844 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.510570 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.510616 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.510626 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.510646 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.510655 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:05Z","lastTransitionTime":"2026-02-26T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.613682 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.613725 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.613740 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.613760 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.613775 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:05Z","lastTransitionTime":"2026-02-26T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.716906 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.716946 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.716956 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.716973 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.716984 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:05Z","lastTransitionTime":"2026-02-26T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.819785 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.819843 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.819856 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.819878 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.819892 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:05Z","lastTransitionTime":"2026-02-26T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.925159 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.925272 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.925287 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.925312 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:05 crc kubenswrapper[4925]: I0226 08:46:05.925353 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:05Z","lastTransitionTime":"2026-02-26T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.028892 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.028939 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.028960 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.028985 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.029004 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:06Z","lastTransitionTime":"2026-02-26T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.132237 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.132296 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.132319 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.132347 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.132366 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:06Z","lastTransitionTime":"2026-02-26T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.234605 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.234671 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.234687 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.234713 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.234731 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:06Z","lastTransitionTime":"2026-02-26T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.338112 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.338190 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.338209 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.338235 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.338252 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:06Z","lastTransitionTime":"2026-02-26T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.441064 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.441118 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.441135 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.441160 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.441177 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:06Z","lastTransitionTime":"2026-02-26T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.543924 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.543957 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.543966 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.543984 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.544000 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:06Z","lastTransitionTime":"2026-02-26T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.647810 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.647862 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.647877 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.647899 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.647916 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:06Z","lastTransitionTime":"2026-02-26T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.750724 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.750812 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.750823 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.750845 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.750859 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:06Z","lastTransitionTime":"2026-02-26T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.854240 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.854284 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.854294 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.854307 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.854335 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:06Z","lastTransitionTime":"2026-02-26T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.957695 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.957769 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.957786 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.957814 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:06 crc kubenswrapper[4925]: I0226 08:46:06.957832 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:06Z","lastTransitionTime":"2026-02-26T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.060885 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.060946 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.060963 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.060989 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.061008 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:07Z","lastTransitionTime":"2026-02-26T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.164448 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.164515 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.164569 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.164593 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.164610 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:07Z","lastTransitionTime":"2026-02-26T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.268793 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.268881 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.268910 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.268945 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.268967 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:07Z","lastTransitionTime":"2026-02-26T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.372691 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.372768 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.372785 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.372810 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.372828 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:07Z","lastTransitionTime":"2026-02-26T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.476285 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.476325 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.476334 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.476352 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.476362 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:07Z","lastTransitionTime":"2026-02-26T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.506777 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.506876 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:07 crc kubenswrapper[4925]: E0226 08:46:07.507107 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.507195 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:07 crc kubenswrapper[4925]: E0226 08:46:07.506892 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.507251 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:07 crc kubenswrapper[4925]: E0226 08:46:07.507338 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:07 crc kubenswrapper[4925]: E0226 08:46:07.507417 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.579575 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.579610 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.579625 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.579643 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.579654 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:07Z","lastTransitionTime":"2026-02-26T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.681891 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.681919 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.681927 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.681940 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.681948 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:07Z","lastTransitionTime":"2026-02-26T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.784437 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.784470 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.784483 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.784511 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.784555 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:07Z","lastTransitionTime":"2026-02-26T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.886876 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.886934 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.886952 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.886975 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.886993 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:07Z","lastTransitionTime":"2026-02-26T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.990065 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.990133 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.990157 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.990186 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:07 crc kubenswrapper[4925]: I0226 08:46:07.990207 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:07Z","lastTransitionTime":"2026-02-26T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.092856 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.092933 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.092956 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.092985 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.093008 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:08Z","lastTransitionTime":"2026-02-26T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.196468 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.196508 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.196521 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.196556 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.196568 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:08Z","lastTransitionTime":"2026-02-26T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.299628 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.299675 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.299686 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.299701 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.299711 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:08Z","lastTransitionTime":"2026-02-26T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.403447 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.403496 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.403513 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.403565 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.403588 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:08Z","lastTransitionTime":"2026-02-26T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.507281 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.507332 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.507349 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.507372 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.507391 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:08Z","lastTransitionTime":"2026-02-26T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.520303 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.534609 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.546402 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.558692 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.569063 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc 
kubenswrapper[4925]: I0226 08:46:08.580657 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.592614 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.604007 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636
aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.609305 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.609339 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.609349 4925 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.609364 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.609375 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:08Z","lastTransitionTime":"2026-02-26T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.617274 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.636240 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:45:56Z\\\",\\\"message\\\":\\\"id: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z]\\\\nI0226 08:45:56.937231 6906 services_controller.go:434] Service openshift-marketplace/marketplace-operator-metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{marketplace-operator-metrics openshift-marketplace ee1b3a20-644f-4c69-a038-1d53fcace871 
4537 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[name:marketplace-operator] map[capability.openshift.io/name:marketplace include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00747db57 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},Clu\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438ce
e1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.645411 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.656824 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.666480 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.677220 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.687825 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.712282 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.712317 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.712326 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.712340 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.712349 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:08Z","lastTransitionTime":"2026-02-26T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.814778 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.814828 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.814875 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.814892 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.814904 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:08Z","lastTransitionTime":"2026-02-26T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.918795 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.918847 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.918858 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.918875 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:08 crc kubenswrapper[4925]: I0226 08:46:08.918886 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:08Z","lastTransitionTime":"2026-02-26T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.021104 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.021141 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.021179 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.021193 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.021203 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.123829 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.123899 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.123922 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.123949 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.123972 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.227733 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.227802 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.227819 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.227842 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.227859 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.330494 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.330553 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.330563 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.330595 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.330608 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.433767 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.433814 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.433826 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.433842 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.433854 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.506248 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.506367 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:09 crc kubenswrapper[4925]: E0226 08:46:09.506426 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:09 crc kubenswrapper[4925]: E0226 08:46:09.506502 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.506608 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.506616 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:09 crc kubenswrapper[4925]: E0226 08:46:09.506666 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:09 crc kubenswrapper[4925]: E0226 08:46:09.506726 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.536046 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.536082 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.536091 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.536103 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.536133 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.549119 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.549178 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.549195 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.549219 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.549235 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: E0226 08:46:09.565632 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.569443 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.569472 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.569482 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.569495 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.569504 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: E0226 08:46:09.582402 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.586006 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.586026 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.586033 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.586043 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.586050 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: E0226 08:46:09.599113 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.603566 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.603599 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.603607 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.603621 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.603630 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: E0226 08:46:09.617099 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.620163 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.620195 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.620206 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.620221 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.620233 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: E0226 08:46:09.629905 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:09 crc kubenswrapper[4925]: E0226 08:46:09.630083 4925 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.638116 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.638160 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.638169 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.638182 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.638191 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.741013 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.741081 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.741104 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.741132 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.741154 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.847964 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.848010 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.848161 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.848224 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.848237 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.950689 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.950741 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.950753 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.950771 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:09 crc kubenswrapper[4925]: I0226 08:46:09.950788 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:09Z","lastTransitionTime":"2026-02-26T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.053932 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.053971 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.053983 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.053999 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.054010 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:10Z","lastTransitionTime":"2026-02-26T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.156728 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.156767 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.156777 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.156792 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.156806 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:10Z","lastTransitionTime":"2026-02-26T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.259315 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.259394 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.259418 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.259446 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.259463 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:10Z","lastTransitionTime":"2026-02-26T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.362612 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.362711 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.362729 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.362752 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.362768 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:10Z","lastTransitionTime":"2026-02-26T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.465750 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.465816 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.465828 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.465847 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.465861 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:10Z","lastTransitionTime":"2026-02-26T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.508299 4925 scope.go:117] "RemoveContainer" containerID="829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.568473 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.568540 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.568552 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.568571 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.568583 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:10Z","lastTransitionTime":"2026-02-26T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.671722 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.671765 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.671776 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.671794 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.671806 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:10Z","lastTransitionTime":"2026-02-26T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.775043 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.775097 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.775107 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.775127 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.775141 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:10Z","lastTransitionTime":"2026-02-26T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.878294 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.878335 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.878344 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.878359 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.878369 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:10Z","lastTransitionTime":"2026-02-26T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.982100 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.982158 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.982168 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.982192 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:10 crc kubenswrapper[4925]: I0226 08:46:10.982205 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:10Z","lastTransitionTime":"2026-02-26T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.017115 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/1.log" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.020491 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerStarted","Data":"efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4"} Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.020964 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.035634 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b
6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.051220 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863
ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.072152 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636
aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.084433 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.084484 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.084497 4925 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.084519 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.084549 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:11Z","lastTransitionTime":"2026-02-26T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.088697 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.114643 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:45:56Z\\\",\\\"message\\\":\\\"id: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z]\\\\nI0226 08:45:56.937231 6906 services_controller.go:434] Service openshift-marketplace/marketplace-operator-metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{marketplace-operator-metrics openshift-marketplace ee1b3a20-644f-4c69-a038-1d53fcace871 
4537 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[name:marketplace-operator] map[capability.openshift.io/name:marketplace include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00747db57 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: 
marketplace-operator,},Clu\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube
-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubec
fg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.130953 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.155438 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.185261 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.187062 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.187098 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.187109 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.187128 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.187143 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:11Z","lastTransitionTime":"2026-02-26T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.220537 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.236320 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.264377 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.276637 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.290932 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.290971 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.290982 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:11 crc 
kubenswrapper[4925]: I0226 08:46:11.291000 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.291012 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:11Z","lastTransitionTime":"2026-02-26T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.291305 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 
08:46:11.308582 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc 
kubenswrapper[4925]: I0226 08:46:11.327436 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:11Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.394071 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.394115 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.394126 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.394144 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.394157 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:11Z","lastTransitionTime":"2026-02-26T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.497658 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.497701 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.497711 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.497727 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.497738 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:11Z","lastTransitionTime":"2026-02-26T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.506105 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:11 crc kubenswrapper[4925]: E0226 08:46:11.506270 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.506304 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.506284 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.506340 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:11 crc kubenswrapper[4925]: E0226 08:46:11.506871 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:11 crc kubenswrapper[4925]: E0226 08:46:11.506950 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:11 crc kubenswrapper[4925]: E0226 08:46:11.507030 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.523419 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.600498 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.600558 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.600569 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.600590 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.600606 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:11Z","lastTransitionTime":"2026-02-26T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.703000 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.703069 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.703107 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.703138 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.703161 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:11Z","lastTransitionTime":"2026-02-26T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.806236 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.806300 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.806310 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.806333 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.806349 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:11Z","lastTransitionTime":"2026-02-26T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.909902 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.909988 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.910010 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.910039 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:11 crc kubenswrapper[4925]: I0226 08:46:11.910063 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:11Z","lastTransitionTime":"2026-02-26T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.013121 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.013196 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.013213 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.013241 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.013260 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:12Z","lastTransitionTime":"2026-02-26T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.028316 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/2.log" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.029467 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/1.log" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.034208 4925 generic.go:334] "Generic (PLEG): container finished" podID="671c7a2c-75d0-4322-a581-3d1b26bda633" containerID="efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4" exitCode=1 Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.034317 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerDied","Data":"efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4"} Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.034407 4925 scope.go:117] "RemoveContainer" containerID="829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.035799 4925 scope.go:117] "RemoveContainer" containerID="efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4" Feb 26 08:46:12 crc kubenswrapper[4925]: E0226 08:46:12.036465 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" podUID="671c7a2c-75d0-4322-a581-3d1b26bda633" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.050823 4925 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.068121 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.086868 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.102066 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.116759 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.116970 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.117017 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.117035 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.117060 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.117076 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:12Z","lastTransitionTime":"2026-02-26T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.132025 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc 
kubenswrapper[4925]: I0226 08:46:12.150041 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.165017 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.179236 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.196179 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/e
tc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.218831 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://829b25520b18cc8040a55486d3353b05ee58c563c07123b47d790bf9a490c581\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:45:56Z\\\",\\\"message\\\":\\\"id: current time 2026-02-26T08:45:56Z is after 2025-08-24T17:21:41Z]\\\\nI0226 08:45:56.937231 6906 services_controller.go:434] Service openshift-marketplace/marketplace-operator-metrics retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{marketplace-operator-metrics openshift-marketplace ee1b3a20-644f-4c69-a038-1d53fcace871 
4537 0 2025-02-23 05:12:24 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[name:marketplace-operator] map[capability.openshift.io/name:marketplace include.release.openshift.io/hypershift:true include.release.openshift.io/ibm-cloud-managed:true include.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc00747db57 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},Clu\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:11Z\\\",\\\"message\\\":\\\" 7057 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:46:11.381663 7057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 08:46:11.381712 7057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 08:46:11.381755 7057 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 08:46:11.381786 7057 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 08:46:11.381810 7057 handler.go:190] Sending *v1.Node event 
handler 7 for removal\\\\nI0226 08:46:11.381860 7057 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:46:11.381915 7057 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:46:11.381936 7057 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:46:11.381948 7057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 08:46:11.381921 7057 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:46:11.381964 7057 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 08:46:11.381968 7057 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:46:11.381927 7057 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 08:46:11.382007 7057 factory.go:656] Stopping watch factory\\\\nI0226 08:46:11.382043 7057 ovnkube.go:599] Stopped ovnkube\\\\nI0226 08:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host
-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.219979 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.220014 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.220025 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.220048 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.220061 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:12Z","lastTransitionTime":"2026-02-26T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.233827 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.251881 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.265521 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.298135 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9bb4d7-d949-4ead-b4d5-4a603104c6d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11597156d0bd5fe891962baaac60089238f8ef371feffd647a62af174bb6dfbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aec7efa5468d3890f22dadb3185aa046a9de96f08c80d8a3ecaa0ea21c404dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5736abc0c618dc7ce8a2d18bba180888ba30d10c8648ba3d99309564717819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f29be76c16486a46b39eabf82b3cf56b4b9060dd7893da06a3f1de3ab7728a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c22a47f6f43936c0e85d447fd31a331cd28cbf20cafaa5111f576b7b9445d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.317310 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T08:46:12Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.325488 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.325550 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.325561 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.325578 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.325592 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:12Z","lastTransitionTime":"2026-02-26T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.428454 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.428575 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.428611 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.428637 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.428655 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:12Z","lastTransitionTime":"2026-02-26T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.532278 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.532352 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.532374 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.532405 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.532431 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:12Z","lastTransitionTime":"2026-02-26T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.635958 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.636036 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.636060 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.636089 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.636111 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:12Z","lastTransitionTime":"2026-02-26T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.739140 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.739207 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.739234 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.739304 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.739332 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:12Z","lastTransitionTime":"2026-02-26T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.842141 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.842197 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.842207 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.842224 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.842235 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:12Z","lastTransitionTime":"2026-02-26T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.945164 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.945208 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.945222 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.945237 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:12 crc kubenswrapper[4925]: I0226 08:46:12.945248 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:12Z","lastTransitionTime":"2026-02-26T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.039701 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/2.log" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.043457 4925 scope.go:117] "RemoveContainer" containerID="efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4" Feb 26 08:46:13 crc kubenswrapper[4925]: E0226 08:46:13.043651 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" podUID="671c7a2c-75d0-4322-a581-3d1b26bda633" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.048868 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.048940 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.048960 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.048987 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.049003 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:13Z","lastTransitionTime":"2026-02-26T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.058978 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.074036 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.085829 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.097017 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.109340 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.121967 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.142090 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:11Z\\\",\\\"message\\\":\\\" 7057 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:46:11.381663 7057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 08:46:11.381712 7057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 08:46:11.381755 7057 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI0226 08:46:11.381786 7057 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 08:46:11.381810 7057 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:46:11.381860 7057 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:46:11.381915 7057 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:46:11.381936 7057 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:46:11.381948 7057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 08:46:11.381921 7057 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:46:11.381964 7057 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 08:46:11.381968 7057 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:46:11.381927 7057 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 08:46:11.382007 7057 factory.go:656] Stopping watch factory\\\\nI0226 08:46:11.382043 7057 ovnkube.go:599] Stopped ovnkube\\\\nI0226 08:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:46:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438ce
e1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.151763 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.151809 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.151820 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.151839 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.151851 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:13Z","lastTransitionTime":"2026-02-26T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.165519 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9bb4d7-d949-4ead-b4d5-4a603104c6d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11597156d0bd5fe891962baaac60089238f8ef371feffd647a62af174bb6dfbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aec7efa5468d3890f22dadb3185aa046a9de96f08c80d8a3ecaa0ea21c404dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5736abc0c618dc7ce8a2d18bba180888ba30d10c8648ba3d99309564717819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f29be76c16486a46b39eabf82b3cf56b4b9060dd7893da06a3f1de3ab7728a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c22a47f6f43936c0e85d447fd31a331cd28cbf20cafaa5111f576b7b9445d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.180511 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.195563 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.209302 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.222988 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.238421 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.253338 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.254684 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.254800 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.254895 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:13 crc 
kubenswrapper[4925]: I0226 08:46:13.254961 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.255029 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:13Z","lastTransitionTime":"2026-02-26T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.267740 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 
08:46:13.279898 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:13Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:13 crc 
kubenswrapper[4925]: I0226 08:46:13.357902 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.358004 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.358023 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.358046 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.358065 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:13Z","lastTransitionTime":"2026-02-26T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.460973 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.461004 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.461012 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.461025 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.461034 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:13Z","lastTransitionTime":"2026-02-26T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.506690 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.506716 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:13 crc kubenswrapper[4925]: E0226 08:46:13.506812 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.507148 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:13 crc kubenswrapper[4925]: E0226 08:46:13.507208 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.507235 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:13 crc kubenswrapper[4925]: E0226 08:46:13.507275 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:13 crc kubenswrapper[4925]: E0226 08:46:13.507331 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.564135 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.564170 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.564179 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.564193 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.564204 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:13Z","lastTransitionTime":"2026-02-26T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.667471 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.667574 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.667598 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.667628 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.667650 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:13Z","lastTransitionTime":"2026-02-26T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.770283 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.770333 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.770380 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.770401 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.770413 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:13Z","lastTransitionTime":"2026-02-26T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.873474 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.873518 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.873558 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.873641 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.873732 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:13Z","lastTransitionTime":"2026-02-26T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.976372 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.976405 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.976413 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.976425 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:13 crc kubenswrapper[4925]: I0226 08:46:13.976435 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:13Z","lastTransitionTime":"2026-02-26T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.079170 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.079220 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.079231 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.079248 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.079260 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:14Z","lastTransitionTime":"2026-02-26T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.181364 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.181413 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.181423 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.181441 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.181452 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:14Z","lastTransitionTime":"2026-02-26T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.284212 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.284276 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.284299 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.284329 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.284351 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:14Z","lastTransitionTime":"2026-02-26T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.386874 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.386924 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.386935 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.386950 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.386962 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:14Z","lastTransitionTime":"2026-02-26T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.490573 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.490641 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.490665 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.490695 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.490717 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:14Z","lastTransitionTime":"2026-02-26T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.594013 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.594089 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.594110 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.594139 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.594158 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:14Z","lastTransitionTime":"2026-02-26T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.696741 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.696792 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.696800 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.696816 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.696826 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:14Z","lastTransitionTime":"2026-02-26T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.800056 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.800100 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.800112 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.800129 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.800141 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:14Z","lastTransitionTime":"2026-02-26T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.902127 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.902167 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.902178 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.902192 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:14 crc kubenswrapper[4925]: I0226 08:46:14.902203 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:14Z","lastTransitionTime":"2026-02-26T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.005165 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.005221 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.005237 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.005260 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.005278 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:15Z","lastTransitionTime":"2026-02-26T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.107340 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.107377 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.107386 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.107399 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.107409 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:15Z","lastTransitionTime":"2026-02-26T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.210116 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.210171 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.210190 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.210213 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.210230 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:15Z","lastTransitionTime":"2026-02-26T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.312860 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.312909 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.312924 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.312944 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.312957 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:15Z","lastTransitionTime":"2026-02-26T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.415188 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.415227 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.415238 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.415254 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.415265 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:15Z","lastTransitionTime":"2026-02-26T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.507043 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.507095 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.507095 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.507065 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:15 crc kubenswrapper[4925]: E0226 08:46:15.507275 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:15 crc kubenswrapper[4925]: E0226 08:46:15.507374 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:15 crc kubenswrapper[4925]: E0226 08:46:15.507494 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:15 crc kubenswrapper[4925]: E0226 08:46:15.507584 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.517944 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.517980 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.517990 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.518005 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.518017 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:15Z","lastTransitionTime":"2026-02-26T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.620677 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.620725 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.620736 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.620754 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.620765 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:15Z","lastTransitionTime":"2026-02-26T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.723684 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.723727 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.723736 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.723752 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.723765 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:15Z","lastTransitionTime":"2026-02-26T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.826435 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.826483 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.826496 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.826511 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.826539 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:15Z","lastTransitionTime":"2026-02-26T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.929781 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.930080 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.930088 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.930102 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:15 crc kubenswrapper[4925]: I0226 08:46:15.930112 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:15Z","lastTransitionTime":"2026-02-26T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.033256 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.033860 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.033894 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.033916 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.033928 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:16Z","lastTransitionTime":"2026-02-26T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.136554 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.136600 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.136609 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.136623 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.136633 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:16Z","lastTransitionTime":"2026-02-26T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.239226 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.239294 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.239321 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.239343 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.239358 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:16Z","lastTransitionTime":"2026-02-26T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.342980 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.343039 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.343055 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.343078 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.343335 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:16Z","lastTransitionTime":"2026-02-26T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.446114 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.446167 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.446180 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.446196 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.446208 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:16Z","lastTransitionTime":"2026-02-26T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.549205 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.549250 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.549264 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.549286 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.549301 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:16Z","lastTransitionTime":"2026-02-26T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.652101 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.652176 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.652190 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.652215 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.652263 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:16Z","lastTransitionTime":"2026-02-26T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.757417 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.757485 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.757498 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.757522 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.757557 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:16Z","lastTransitionTime":"2026-02-26T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.860686 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.860742 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.860753 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.860772 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.860784 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:16Z","lastTransitionTime":"2026-02-26T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.965142 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.965197 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.965211 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.965227 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:16 crc kubenswrapper[4925]: I0226 08:46:16.965238 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:16Z","lastTransitionTime":"2026-02-26T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.067517 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.067828 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.067909 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.067990 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.068068 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:17Z","lastTransitionTime":"2026-02-26T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.175901 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.175958 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.175971 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.175993 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.176009 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:17Z","lastTransitionTime":"2026-02-26T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.278958 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.279423 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.279617 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.279760 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.279850 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:17Z","lastTransitionTime":"2026-02-26T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.383550 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.383590 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.383600 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.383617 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.383626 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:17Z","lastTransitionTime":"2026-02-26T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.486051 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.486101 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.486116 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.486134 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.486145 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:17Z","lastTransitionTime":"2026-02-26T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.506999 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.507126 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.507188 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.507188 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.507384 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.507419 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.507214 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.507499 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.588818 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.588892 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.588908 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.588935 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.588949 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:17Z","lastTransitionTime":"2026-02-26T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.673807 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.673936 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674025 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:46:49.673984227 +0000 UTC m=+141.813558510 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674076 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674101 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674113 4925 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674174 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 08:46:49.674156791 +0000 UTC m=+141.813731284 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.674167 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.674227 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.674297 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.674334 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" 
(UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674377 4925 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674465 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs podName:c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc nodeName:}" failed. No retries permitted until 2026-02-26 08:46:49.674437449 +0000 UTC m=+141.814011902 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs") pod "network-metrics-daemon-96njb" (UID: "c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674386 4925 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674516 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674551 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674572 4925 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674579 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:46:49.674559952 +0000 UTC m=+141.814134265 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674589 4925 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674616 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 08:46:49.674607693 +0000 UTC m=+141.814181976 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:46:17 crc kubenswrapper[4925]: E0226 08:46:17.674715 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:46:49.674693116 +0000 UTC m=+141.814267589 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.692012 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.692058 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.692072 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.692090 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.692102 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:17Z","lastTransitionTime":"2026-02-26T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.795102 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.795165 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.795174 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.795194 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.795205 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:17Z","lastTransitionTime":"2026-02-26T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.898276 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.898344 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.898367 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.898398 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:17 crc kubenswrapper[4925]: I0226 08:46:17.898423 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:17Z","lastTransitionTime":"2026-02-26T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.001975 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.002029 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.002043 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.002060 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.002073 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:18Z","lastTransitionTime":"2026-02-26T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.104788 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.104849 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.104867 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.104893 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.104910 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:18Z","lastTransitionTime":"2026-02-26T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.206982 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.207059 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.207085 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.207115 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.207137 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:18Z","lastTransitionTime":"2026-02-26T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.310488 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.310610 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.310638 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.310670 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.310696 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:18Z","lastTransitionTime":"2026-02-26T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.413913 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.413989 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.414013 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.414063 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.414091 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:18Z","lastTransitionTime":"2026-02-26T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.508929 4925 scope.go:117] "RemoveContainer" containerID="450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011" Feb 26 08:46:18 crc kubenswrapper[4925]: E0226 08:46:18.509799 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.516121 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.516210 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.516235 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.516266 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.516289 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:18Z","lastTransitionTime":"2026-02-26T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.525414 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.547035 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.564794 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.582470 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.596826 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.609034 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc 
kubenswrapper[4925]: I0226 08:46:18.617995 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.618040 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.618055 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.618076 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.618092 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:18Z","lastTransitionTime":"2026-02-26T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.632204 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.645692 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.658382 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863
ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.670860 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636
aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.687582 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.706718 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:11Z\\\",\\\"message\\\":\\\" 7057 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:46:11.381663 7057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 08:46:11.381712 7057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 08:46:11.381755 7057 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI0226 08:46:11.381786 7057 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 08:46:11.381810 7057 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:46:11.381860 7057 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:46:11.381915 7057 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:46:11.381936 7057 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:46:11.381948 7057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 08:46:11.381921 7057 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:46:11.381964 7057 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 08:46:11.381968 7057 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:46:11.381927 7057 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 08:46:11.382007 7057 factory.go:656] Stopping watch factory\\\\nI0226 08:46:11.382043 7057 ovnkube.go:599] Stopped ovnkube\\\\nI0226 08:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:46:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438ce
e1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.716326 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.720246 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.720285 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.720293 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.720306 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.720315 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:18Z","lastTransitionTime":"2026-02-26T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.727688 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.739649 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.768791 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9bb4d7-d949-4ead-b4d5-4a603104c6d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11597156d0bd5fe891962baaac60089238f8ef371feffd647a62af174bb6dfbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aec7efa5468d3890f22dadb3185aa046a9de96f08c80d8a3ecaa0ea21c404dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5736abc0c618dc7ce8a2d18bba180888ba30d10c8648ba3d99309564717819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f29be76c16486a46b39eabf82b3cf56b4b9060dd7893da06a3f1de3ab7728a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c22a47f6f43936c0e85d447fd31a331cd28cbf20cafaa5111f576b7b9445d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:31Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.822723 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.822792 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.822810 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.822834 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.822851 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:18Z","lastTransitionTime":"2026-02-26T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.925149 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.925231 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.925255 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.925286 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:18 crc kubenswrapper[4925]: I0226 08:46:18.925309 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:18Z","lastTransitionTime":"2026-02-26T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.028943 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.028995 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.029005 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.029021 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.029032 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.133048 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.133097 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.133108 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.133125 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.133138 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.235404 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.235444 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.235455 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.235471 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.235480 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.338195 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.338242 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.338250 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.338263 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.338271 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.441052 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.441113 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.441130 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.441152 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.441167 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.506063 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.506161 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:19 crc kubenswrapper[4925]: E0226 08:46:19.506200 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.506255 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:19 crc kubenswrapper[4925]: E0226 08:46:19.506326 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:19 crc kubenswrapper[4925]: E0226 08:46:19.506463 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.506486 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:19 crc kubenswrapper[4925]: E0226 08:46:19.506710 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.544312 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.544349 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.544358 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.544373 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.544385 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.646285 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.646328 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.646336 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.646355 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.646365 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.749039 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.749091 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.749109 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.749134 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.749153 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.831173 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.831244 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.831266 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.831315 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.831341 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: E0226 08:46:19.848977 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:19Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.854716 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.854782 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.854805 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.854833 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.854854 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: E0226 08:46:19.875002 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:19Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.879913 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.880000 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.880026 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.880057 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.880079 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: E0226 08:46:19.901141 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:19Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.908343 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.908407 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.908417 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.908431 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.908440 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: E0226 08:46:19.920584 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:19Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.925352 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.925388 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.925397 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.925415 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.925426 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:19 crc kubenswrapper[4925]: E0226 08:46:19.937506 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:19Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:19 crc kubenswrapper[4925]: E0226 08:46:19.937678 4925 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.939172 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.939201 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.939217 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.939235 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:19 crc kubenswrapper[4925]: I0226 08:46:19.939246 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:19Z","lastTransitionTime":"2026-02-26T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.041246 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.041310 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.041368 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.041388 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.041397 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:20Z","lastTransitionTime":"2026-02-26T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.143868 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.143917 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.143926 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.143939 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.143947 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:20Z","lastTransitionTime":"2026-02-26T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.245982 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.246018 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.246028 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.246045 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.246061 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:20Z","lastTransitionTime":"2026-02-26T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.348960 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.349031 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.349053 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.349088 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.349114 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:20Z","lastTransitionTime":"2026-02-26T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.452705 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.452834 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.452846 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.452869 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.452921 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:20Z","lastTransitionTime":"2026-02-26T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.556133 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.556206 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.556227 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.556257 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.556275 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:20Z","lastTransitionTime":"2026-02-26T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.658742 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.658784 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.658792 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.658808 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.658817 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:20Z","lastTransitionTime":"2026-02-26T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.761742 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.761805 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.761825 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.761884 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.761903 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:20Z","lastTransitionTime":"2026-02-26T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.864728 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.864893 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.864923 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.865007 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.865064 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:20Z","lastTransitionTime":"2026-02-26T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.968422 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.968479 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.968488 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.968508 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:20 crc kubenswrapper[4925]: I0226 08:46:20.968540 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:20Z","lastTransitionTime":"2026-02-26T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.084758 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.084821 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.084837 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.084859 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.084874 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:21Z","lastTransitionTime":"2026-02-26T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.188084 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.188158 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.188181 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.188215 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.188242 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:21Z","lastTransitionTime":"2026-02-26T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.291693 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.291776 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.291799 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.291827 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.291850 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:21Z","lastTransitionTime":"2026-02-26T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.395393 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.395467 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.395503 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.395575 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.395611 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:21Z","lastTransitionTime":"2026-02-26T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.498925 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.498997 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.499009 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.499031 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.499046 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:21Z","lastTransitionTime":"2026-02-26T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.506278 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.506361 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:21 crc kubenswrapper[4925]: E0226 08:46:21.506431 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.506383 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.506383 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:21 crc kubenswrapper[4925]: E0226 08:46:21.506575 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:21 crc kubenswrapper[4925]: E0226 08:46:21.506743 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:21 crc kubenswrapper[4925]: E0226 08:46:21.506855 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.602646 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.602733 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.602774 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.602852 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.602881 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:21Z","lastTransitionTime":"2026-02-26T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.706365 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.706412 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.706421 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.706437 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.706448 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:21Z","lastTransitionTime":"2026-02-26T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.809381 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.809477 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.809502 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.809563 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.809588 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:21Z","lastTransitionTime":"2026-02-26T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.912166 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.912265 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.912314 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.912348 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:21 crc kubenswrapper[4925]: I0226 08:46:21.912369 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:21Z","lastTransitionTime":"2026-02-26T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.014751 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.014786 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.014795 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.014810 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.014818 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:22Z","lastTransitionTime":"2026-02-26T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.117620 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.117690 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.117708 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.117740 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.117762 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:22Z","lastTransitionTime":"2026-02-26T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.220697 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.220733 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.220744 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.220759 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.220769 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:22Z","lastTransitionTime":"2026-02-26T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.323566 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.323601 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.323610 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.323623 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.323634 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:22Z","lastTransitionTime":"2026-02-26T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.426490 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.426553 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.426563 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.426576 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.426588 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:22Z","lastTransitionTime":"2026-02-26T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.528599 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.528668 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.528678 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.528696 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.528705 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:22Z","lastTransitionTime":"2026-02-26T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.631235 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.631277 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.631286 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.631302 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.631311 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:22Z","lastTransitionTime":"2026-02-26T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.733896 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.734007 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.734096 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.734224 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.734252 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:22Z","lastTransitionTime":"2026-02-26T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.836521 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.836583 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.836592 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.836606 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.836633 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:22Z","lastTransitionTime":"2026-02-26T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.939156 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.939210 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.939221 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.939235 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:22 crc kubenswrapper[4925]: I0226 08:46:22.939246 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:22Z","lastTransitionTime":"2026-02-26T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.042226 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.042297 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.042309 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.042327 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.042339 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:23Z","lastTransitionTime":"2026-02-26T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.144197 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.144231 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.144241 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.144258 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.144269 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:23Z","lastTransitionTime":"2026-02-26T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.246908 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.246953 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.246964 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.246982 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.246995 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:23Z","lastTransitionTime":"2026-02-26T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.349470 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.349513 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.349521 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.349552 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.349561 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:23Z","lastTransitionTime":"2026-02-26T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.452935 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.452991 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.453003 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.453050 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.453065 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:23Z","lastTransitionTime":"2026-02-26T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.506805 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.506896 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.506839 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.506827 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:23 crc kubenswrapper[4925]: E0226 08:46:23.507073 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:23 crc kubenswrapper[4925]: E0226 08:46:23.507179 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:23 crc kubenswrapper[4925]: E0226 08:46:23.507268 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:23 crc kubenswrapper[4925]: E0226 08:46:23.507321 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.556565 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.556612 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.556628 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.556653 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.556670 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:23Z","lastTransitionTime":"2026-02-26T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.659568 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.659619 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.659633 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.659646 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.659655 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:23Z","lastTransitionTime":"2026-02-26T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.762608 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.762658 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.762667 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.762681 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.762692 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:23Z","lastTransitionTime":"2026-02-26T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.865405 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.865441 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.865450 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.865463 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.865472 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:23Z","lastTransitionTime":"2026-02-26T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.968602 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.968667 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.968685 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.968713 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:23 crc kubenswrapper[4925]: I0226 08:46:23.968731 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:23Z","lastTransitionTime":"2026-02-26T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.071741 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.071809 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.071833 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.071872 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.071900 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:24Z","lastTransitionTime":"2026-02-26T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.174722 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.174783 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.174792 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.174808 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.174819 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:24Z","lastTransitionTime":"2026-02-26T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.277520 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.277579 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.277589 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.277604 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.277616 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:24Z","lastTransitionTime":"2026-02-26T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.380737 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.380814 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.380833 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.380853 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.380865 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:24Z","lastTransitionTime":"2026-02-26T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.483373 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.483855 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.483980 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.484088 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.484191 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:24Z","lastTransitionTime":"2026-02-26T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.586951 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.587310 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.587419 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.587507 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.587692 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:24Z","lastTransitionTime":"2026-02-26T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.689831 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.689880 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.689892 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.689911 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.689923 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:24Z","lastTransitionTime":"2026-02-26T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.792713 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.792753 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.792764 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.792781 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.792792 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:24Z","lastTransitionTime":"2026-02-26T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.895586 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.895645 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.895668 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.895695 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.895715 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:24Z","lastTransitionTime":"2026-02-26T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.998568 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.998610 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.998621 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.998639 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:24 crc kubenswrapper[4925]: I0226 08:46:24.998654 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:24Z","lastTransitionTime":"2026-02-26T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.099965 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.100029 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.100050 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.100077 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.100098 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:25Z","lastTransitionTime":"2026-02-26T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.204147 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.204179 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.204189 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.204204 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.204214 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:25Z","lastTransitionTime":"2026-02-26T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.307058 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.307134 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.307161 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.307192 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.307214 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:25Z","lastTransitionTime":"2026-02-26T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.410586 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.410623 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.410637 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.410652 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.410662 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:25Z","lastTransitionTime":"2026-02-26T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.506765 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:25 crc kubenswrapper[4925]: E0226 08:46:25.506912 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.507130 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:25 crc kubenswrapper[4925]: E0226 08:46:25.507193 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.507315 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:25 crc kubenswrapper[4925]: E0226 08:46:25.507385 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.507520 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:25 crc kubenswrapper[4925]: E0226 08:46:25.507597 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.513505 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.513563 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.513576 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.513590 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.513601 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:25Z","lastTransitionTime":"2026-02-26T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.616822 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.616870 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.616880 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.616898 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.616912 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:25Z","lastTransitionTime":"2026-02-26T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.719701 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.719732 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.719740 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.719753 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.719763 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:25Z","lastTransitionTime":"2026-02-26T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.822436 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.822478 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.822489 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.822505 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.822518 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:25Z","lastTransitionTime":"2026-02-26T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.925474 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.925549 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.925567 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.925592 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:25 crc kubenswrapper[4925]: I0226 08:46:25.925613 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:25Z","lastTransitionTime":"2026-02-26T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.028378 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.028410 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.028419 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.028433 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.028442 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:26Z","lastTransitionTime":"2026-02-26T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.131554 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.131601 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.131612 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.131630 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.131643 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:26Z","lastTransitionTime":"2026-02-26T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.234276 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.234368 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.234382 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.234401 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.234413 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:26Z","lastTransitionTime":"2026-02-26T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.336628 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.336675 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.336688 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.336705 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.336717 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:26Z","lastTransitionTime":"2026-02-26T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.439007 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.439039 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.439047 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.439062 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.439071 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:26Z","lastTransitionTime":"2026-02-26T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.542124 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.542244 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.542263 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.542288 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.542306 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:26Z","lastTransitionTime":"2026-02-26T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.645194 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.645244 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.645258 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.645278 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.645299 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:26Z","lastTransitionTime":"2026-02-26T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.748960 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.749041 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.749071 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.749096 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.749114 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:26Z","lastTransitionTime":"2026-02-26T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.851933 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.852388 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.852412 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.852437 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.852456 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:26Z","lastTransitionTime":"2026-02-26T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.955372 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.955427 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.955437 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.955453 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:26 crc kubenswrapper[4925]: I0226 08:46:26.955463 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:26Z","lastTransitionTime":"2026-02-26T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.057671 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.057727 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.057744 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.057785 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.057803 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:27Z","lastTransitionTime":"2026-02-26T08:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.160468 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.160559 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.160579 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.160606 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.160622 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:27Z","lastTransitionTime":"2026-02-26T08:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.263039 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.263079 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.263088 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.263101 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.263110 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:27Z","lastTransitionTime":"2026-02-26T08:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.373101 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.373168 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.373185 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.373208 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.373224 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:27Z","lastTransitionTime":"2026-02-26T08:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.476063 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.476111 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.476121 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.476160 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.476172 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:27Z","lastTransitionTime":"2026-02-26T08:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.506615 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.506651 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.506652 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:27 crc kubenswrapper[4925]: E0226 08:46:27.506756 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.506767 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:27 crc kubenswrapper[4925]: E0226 08:46:27.506837 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:27 crc kubenswrapper[4925]: E0226 08:46:27.506970 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:27 crc kubenswrapper[4925]: E0226 08:46:27.507295 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.507603 4925 scope.go:117] "RemoveContainer" containerID="efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4" Feb 26 08:46:27 crc kubenswrapper[4925]: E0226 08:46:27.507765 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" podUID="671c7a2c-75d0-4322-a581-3d1b26bda633" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.579427 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.579475 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.579487 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.579505 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.579518 4925 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:27Z","lastTransitionTime":"2026-02-26T08:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.681988 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.682036 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.682054 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.682082 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.682099 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:27Z","lastTransitionTime":"2026-02-26T08:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.785326 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.785435 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.785457 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.785493 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.785516 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:27Z","lastTransitionTime":"2026-02-26T08:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.888503 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.888564 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.888575 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.888592 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.888605 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:27Z","lastTransitionTime":"2026-02-26T08:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.990996 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.991038 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.991047 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.991063 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:27 crc kubenswrapper[4925]: I0226 08:46:27.991074 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:27Z","lastTransitionTime":"2026-02-26T08:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.094247 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.094306 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.094317 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.094340 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.094353 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:28Z","lastTransitionTime":"2026-02-26T08:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.197357 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.197388 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.197400 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.197413 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.197422 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:28Z","lastTransitionTime":"2026-02-26T08:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.299902 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.299943 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.299953 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.299969 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.299978 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:28Z","lastTransitionTime":"2026-02-26T08:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.403120 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.403186 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.403200 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.403221 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.403233 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:28Z","lastTransitionTime":"2026-02-26T08:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:28 crc kubenswrapper[4925]: E0226 08:46:28.503947 4925 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.520750 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.541373 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9bb4d7-d949-4ead-b4d5-4a603104c6d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11597156d0bd5fe891962baaac60089238f8ef371feffd647a62af174bb6dfbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aec7efa5468d3890f22dadb3185aa046a9de96f08c80d8a3ecaa0ea21c404dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5736abc0c618dc7ce8a2d18bba180888ba30d10c8648ba3d99309564717819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f29be76c16486a46b39eabf82b3cf56b4b9060dd7893da06a3f1de3ab7728a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c22a47f6f43936c0e85d447fd31a331cd28cbf20cafaa5111f576b7b9445d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.556001 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.569733 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.585220 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.602971 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.617936 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: E0226 08:46:28.619404 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.634334 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc 
kubenswrapper[4925]: I0226 08:46:28.648598 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.663813 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.676223 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.689103 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.701069 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.714285 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.732366 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:11Z\\\",\\\"message\\\":\\\" 7057 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:46:11.381663 7057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 08:46:11.381712 7057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 08:46:11.381755 7057 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI0226 08:46:11.381786 7057 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 08:46:11.381810 7057 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:46:11.381860 7057 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:46:11.381915 7057 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:46:11.381936 7057 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:46:11.381948 7057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 08:46:11.381921 7057 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:46:11.381964 7057 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 08:46:11.381968 7057 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:46:11.381927 7057 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 08:46:11.382007 7057 factory.go:656] Stopping watch factory\\\\nI0226 08:46:11.382043 7057 ovnkube.go:599] Stopped ovnkube\\\\nI0226 08:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:46:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438ce
e1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:28 crc kubenswrapper[4925]: I0226 08:46:28.743222 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:28Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:29 crc kubenswrapper[4925]: I0226 08:46:29.507080 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:29 crc kubenswrapper[4925]: I0226 08:46:29.507114 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:29 crc kubenswrapper[4925]: I0226 08:46:29.507114 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:29 crc kubenswrapper[4925]: I0226 08:46:29.507142 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:29 crc kubenswrapper[4925]: E0226 08:46:29.507320 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:29 crc kubenswrapper[4925]: E0226 08:46:29.507458 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:29 crc kubenswrapper[4925]: E0226 08:46:29.507646 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:29 crc kubenswrapper[4925]: E0226 08:46:29.507746 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.075583 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.075653 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.075663 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.075685 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.075697 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:30Z","lastTransitionTime":"2026-02-26T08:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:30 crc kubenswrapper[4925]: E0226 08:46:30.096761 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:30Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.101684 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.101756 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.101775 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.102025 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.102064 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:30Z","lastTransitionTime":"2026-02-26T08:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:30 crc kubenswrapper[4925]: E0226 08:46:30.120930 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:30Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.125634 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.125703 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.125721 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.125745 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.125791 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:30Z","lastTransitionTime":"2026-02-26T08:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:30 crc kubenswrapper[4925]: E0226 08:46:30.139687 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:30Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.144488 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.144574 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.144585 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.144611 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.144638 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:30Z","lastTransitionTime":"2026-02-26T08:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:30 crc kubenswrapper[4925]: E0226 08:46:30.159383 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:30Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.164422 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.164458 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.164468 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.164483 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:30 crc kubenswrapper[4925]: I0226 08:46:30.164493 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:30Z","lastTransitionTime":"2026-02-26T08:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:30 crc kubenswrapper[4925]: E0226 08:46:30.176875 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:30Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:30 crc kubenswrapper[4925]: E0226 08:46:30.177031 4925 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 08:46:31 crc kubenswrapper[4925]: I0226 08:46:31.507268 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:31 crc kubenswrapper[4925]: I0226 08:46:31.507360 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:31 crc kubenswrapper[4925]: I0226 08:46:31.507479 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:31 crc kubenswrapper[4925]: E0226 08:46:31.507691 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:31 crc kubenswrapper[4925]: I0226 08:46:31.507980 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:31 crc kubenswrapper[4925]: E0226 08:46:31.508050 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:31 crc kubenswrapper[4925]: E0226 08:46:31.508179 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:31 crc kubenswrapper[4925]: E0226 08:46:31.508413 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:33 crc kubenswrapper[4925]: I0226 08:46:33.506214 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:33 crc kubenswrapper[4925]: I0226 08:46:33.506214 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:33 crc kubenswrapper[4925]: I0226 08:46:33.506422 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:33 crc kubenswrapper[4925]: E0226 08:46:33.506338 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:33 crc kubenswrapper[4925]: I0226 08:46:33.506237 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:33 crc kubenswrapper[4925]: E0226 08:46:33.506727 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:33 crc kubenswrapper[4925]: E0226 08:46:33.506885 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:33 crc kubenswrapper[4925]: E0226 08:46:33.507054 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:33 crc kubenswrapper[4925]: I0226 08:46:33.507961 4925 scope.go:117] "RemoveContainer" containerID="450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011" Feb 26 08:46:33 crc kubenswrapper[4925]: E0226 08:46:33.620697 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.174583 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.176911 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460"} Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.177542 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.178657 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fmxxp_03d05327-5621-49e8-8051-9dccd63f3a5b/kube-multus/0.log" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.178711 4925 generic.go:334] "Generic (PLEG): container finished" podID="03d05327-5621-49e8-8051-9dccd63f3a5b" containerID="5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b" exitCode=1 Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.178747 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fmxxp" event={"ID":"03d05327-5621-49e8-8051-9dccd63f3a5b","Type":"ContainerDied","Data":"5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b"} Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.179318 4925 scope.go:117] "RemoveContainer" containerID="5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.197628 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.215763 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.230882 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.243993 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc 
kubenswrapper[4925]: I0226 08:46:34.263101 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.279179 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.294250 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.310431 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.332641 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:11Z\\\",\\\"message\\\":\\\" 7057 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:46:11.381663 7057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 08:46:11.381712 7057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 08:46:11.381755 7057 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI0226 08:46:11.381786 7057 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 08:46:11.381810 7057 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:46:11.381860 7057 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:46:11.381915 7057 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:46:11.381936 7057 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:46:11.381948 7057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 08:46:11.381921 7057 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:46:11.381964 7057 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 08:46:11.381968 7057 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:46:11.381927 7057 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 08:46:11.382007 7057 factory.go:656] Stopping watch factory\\\\nI0226 08:46:11.382043 7057 ovnkube.go:599] Stopped ovnkube\\\\nI0226 08:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:46:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438ce
e1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.345111 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.359655 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.373000 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.391127 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.407017 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.429173 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9bb4d7-d949-4ead-b4d5-4a603104c6d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11597156d0bd5fe891962baaac60089238f8ef371feffd647a62af174bb6dfbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aec7efa5468d3890f22dadb3185aa046a9d
e96f08c80d8a3ecaa0ea21c404dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5736abc0c618dc7ce8a2d18bba180888ba30d10c8648ba3d99309564717819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f29be76c16486a46b39eabf82b3cf56b4b9060dd7893da06a3f1de3ab7728a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c22a47f6f43936c0e85d447fd31a331cd28cbf20cafaa5111f576b7b9445d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a6598532e7
16cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.443107 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.458749 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.482560 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.496975 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.513343 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.527133 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc 
kubenswrapper[4925]: I0226 08:46:34.538901 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f
wkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.552692 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.564558 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.576182 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.589826 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.608699 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:33Z\\\",\\\"message\\\":\\\"2026-02-26T08:45:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53\\\\n2026-02-26T08:45:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53 to /host/opt/cni/bin/\\\\n2026-02-26T08:45:48Z [verbose] multus-daemon started\\\\n2026-02-26T08:45:48Z [verbose] Readiness Indicator file check\\\\n2026-02-26T08:46:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.630117 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:11Z\\\",\\\"message\\\":\\\" 7057 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:46:11.381663 7057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 08:46:11.381712 7057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 08:46:11.381755 7057 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI0226 08:46:11.381786 7057 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 08:46:11.381810 7057 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:46:11.381860 7057 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:46:11.381915 7057 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:46:11.381936 7057 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:46:11.381948 7057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 08:46:11.381921 7057 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:46:11.381964 7057 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 08:46:11.381968 7057 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:46:11.381927 7057 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 08:46:11.382007 7057 factory.go:656] Stopping watch factory\\\\nI0226 08:46:11.382043 7057 ovnkube.go:599] Stopped ovnkube\\\\nI0226 08:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:46:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438ce
e1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.651351 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9bb4d7-d949-4ead-b4d5-4a603104c6d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11597156d0bd5fe891962baaac60089238f8ef371feffd647a62af174bb6dfbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aec7efa5468d3890f22dadb3185aa046a9de96f08c80d8a3ecaa0ea21c404dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5736abc0c618dc7ce8a2d18bba180888ba30d10c8648ba3d99309564717819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f29be76c16486a46b39eabf82b3cf56b4b9060dd7893da06a3f1de3ab7728a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c22a47f6f43936c0e85d447fd31a331cd28cbf20cafaa5111f576b7b9445d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.663749 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.678127 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:34 crc kubenswrapper[4925]: I0226 08:46:34.697282 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:34Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.185497 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fmxxp_03d05327-5621-49e8-8051-9dccd63f3a5b/kube-multus/0.log" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.185616 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fmxxp" event={"ID":"03d05327-5621-49e8-8051-9dccd63f3a5b","Type":"ContainerStarted","Data":"bafcab86ecc3c83d9dfbd477b70d92f523243231043e280c7f37f9c8e5944a85"} Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.200156 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636
aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.215300 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafcab86ecc3c83d9dfbd477b70d92f523243231043e280c7f37f9c8e5944a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:33Z\\\",\\\"message\\\":\\\"2026-02-26T08:45:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53\\\\n2026-02-26T08:45:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53 to /host/opt/cni/bin/\\\\n2026-02-26T08:45:48Z [verbose] multus-daemon started\\\\n2026-02-26T08:45:48Z [verbose] 
Readiness Indicator file check\\\\n2026-02-26T08:46:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.243589 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:11Z\\\",\\\"message\\\":\\\" 7057 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:46:11.381663 7057 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 08:46:11.381712 7057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 08:46:11.381755 7057 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 08:46:11.381786 7057 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 08:46:11.381810 7057 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:46:11.381860 7057 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:46:11.381915 7057 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:46:11.381936 7057 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:46:11.381948 7057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 08:46:11.381921 7057 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:46:11.381964 7057 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 08:46:11.381968 7057 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:46:11.381927 7057 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 08:46:11.382007 7057 factory.go:656] Stopping watch factory\\\\nI0226 08:46:11.382043 7057 ovnkube.go:599] Stopped ovnkube\\\\nI0226 08:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:46:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438ce
e1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.259186 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.272388 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.283991 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.297595 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.322795 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9bb4d7-d949-4ead-b4d5-4a603104c6d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11597156d0bd5fe891962baaac60089238f8ef371feffd647a62af174bb6dfbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aec7efa5468d3890f22dadb3185aa046a9de96f08c80d8a3ecaa0ea21c404dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5736abc0c618dc7ce8a2d18bba180888ba30d10c8648ba3d99309564717819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f29be76c16486a46b39eabf82b3cf56b4b9060dd7893da06a3f1de3ab7728a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c22a47f6f43936c0e85d447fd31a331cd28cbf20cafaa5111f576b7b9445d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2026-02-26T08:44:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.337196 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.350190 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.369801 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.383840 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.402622 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.414034 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc 
kubenswrapper[4925]: I0226 08:46:35.431631 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.452259 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:35Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.506701 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.506777 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.506790 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:35 crc kubenswrapper[4925]: I0226 08:46:35.506701 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:35 crc kubenswrapper[4925]: E0226 08:46:35.506946 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:35 crc kubenswrapper[4925]: E0226 08:46:35.507402 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:35 crc kubenswrapper[4925]: E0226 08:46:35.507286 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:35 crc kubenswrapper[4925]: E0226 08:46:35.507460 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:37 crc kubenswrapper[4925]: I0226 08:46:37.506410 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:37 crc kubenswrapper[4925]: I0226 08:46:37.506476 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:37 crc kubenswrapper[4925]: E0226 08:46:37.506663 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:37 crc kubenswrapper[4925]: I0226 08:46:37.506798 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:37 crc kubenswrapper[4925]: I0226 08:46:37.506821 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:37 crc kubenswrapper[4925]: E0226 08:46:37.507028 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:37 crc kubenswrapper[4925]: E0226 08:46:37.507127 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:37 crc kubenswrapper[4925]: E0226 08:46:37.507300 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.526293 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.548903 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9bb4d7-d949-4ead-b4d5-4a603104c6d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11597156d0bd5fe891962baaac60089238f8ef371feffd647a62af174bb6dfbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aec7efa5468d3890f22dadb3185aa046a9de96f08c80d8a3ecaa0ea21c404dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5736abc0c618dc7ce8a2d18bba180888ba30d10c8648ba3d99309564717819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f29be76c16486a46b39eabf82b3cf56b4b9060dd7893da06a3f1de3ab7728a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c22a47f6f43936c0e85d447fd31a331cd28cbf20cafaa5111f576b7b9445d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.562383 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.575637 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.589928 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.604368 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.619194 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc 
kubenswrapper[4925]: E0226 08:46:38.622259 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.640336 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.653913 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.666115 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.678000 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafcab86ecc3c83d9dfbd477b70d92f523243231043e280c7f37f9c8e5944a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:33Z\\\",\\\"message\\\":\\\"2026-02-26T08:45:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53\\\\n2026-02-26T08:45:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53 to /host/opt/cni/bin/\\\\n2026-02-26T08:45:48Z [verbose] multus-daemon started\\\\n2026-02-26T08:45:48Z [verbose] 
Readiness Indicator file check\\\\n2026-02-26T08:46:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.695117 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:11Z\\\",\\\"message\\\":\\\" 7057 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:46:11.381663 7057 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 08:46:11.381712 7057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 08:46:11.381755 7057 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 08:46:11.381786 7057 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 08:46:11.381810 7057 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:46:11.381860 7057 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:46:11.381915 7057 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:46:11.381936 7057 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:46:11.381948 7057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 08:46:11.381921 7057 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:46:11.381964 7057 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 08:46:11.381968 7057 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:46:11.381927 7057 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 08:46:11.382007 7057 factory.go:656] Stopping watch factory\\\\nI0226 08:46:11.382043 7057 ovnkube.go:599] Stopped ovnkube\\\\nI0226 08:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:46:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438ce
e1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.704642 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.719388 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.730888 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.742426 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:38 crc kubenswrapper[4925]: I0226 08:46:38.754358 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:38Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:39 crc kubenswrapper[4925]: I0226 08:46:39.506678 4925 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:39 crc kubenswrapper[4925]: I0226 08:46:39.506711 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:39 crc kubenswrapper[4925]: I0226 08:46:39.506672 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:39 crc kubenswrapper[4925]: E0226 08:46:39.506878 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:39 crc kubenswrapper[4925]: E0226 08:46:39.507037 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:39 crc kubenswrapper[4925]: E0226 08:46:39.507170 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:39 crc kubenswrapper[4925]: I0226 08:46:39.507380 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:39 crc kubenswrapper[4925]: E0226 08:46:39.507685 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.386421 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.386467 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.386480 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.386497 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.386508 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:40Z","lastTransitionTime":"2026-02-26T08:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:40 crc kubenswrapper[4925]: E0226 08:46:40.399971 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:40Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.403794 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.403933 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.404012 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.404084 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.404150 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:40Z","lastTransitionTime":"2026-02-26T08:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:40 crc kubenswrapper[4925]: E0226 08:46:40.421519 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:40Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.425214 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.425336 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.425415 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.425491 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.425573 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:40Z","lastTransitionTime":"2026-02-26T08:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:40 crc kubenswrapper[4925]: E0226 08:46:40.441088 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:40Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.446951 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.447085 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.447171 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.447263 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.447341 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:40Z","lastTransitionTime":"2026-02-26T08:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:40 crc kubenswrapper[4925]: E0226 08:46:40.461216 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:40Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.465984 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.466028 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.466042 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.466060 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:40 crc kubenswrapper[4925]: I0226 08:46:40.466073 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:40Z","lastTransitionTime":"2026-02-26T08:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:40 crc kubenswrapper[4925]: E0226 08:46:40.503576 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:40Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:40 crc kubenswrapper[4925]: E0226 08:46:40.503723 4925 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 08:46:41 crc kubenswrapper[4925]: I0226 08:46:41.506429 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:41 crc kubenswrapper[4925]: I0226 08:46:41.506580 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:41 crc kubenswrapper[4925]: I0226 08:46:41.507130 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:41 crc kubenswrapper[4925]: I0226 08:46:41.507357 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:41 crc kubenswrapper[4925]: E0226 08:46:41.507358 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:41 crc kubenswrapper[4925]: E0226 08:46:41.507797 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:41 crc kubenswrapper[4925]: E0226 08:46:41.507935 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:41 crc kubenswrapper[4925]: E0226 08:46:41.508014 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:41 crc kubenswrapper[4925]: I0226 08:46:41.508122 4925 scope.go:117] "RemoveContainer" containerID="efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.214323 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/2.log" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.218862 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerStarted","Data":"2c7dc4395414e112ae29765029027afe8e453d9d9fadc956293c8452aa474f85"} Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.219471 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.235279 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.253228 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.276886 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.296330 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.312187 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.325274 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc 
kubenswrapper[4925]: I0226 08:46:42.341382 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.352164 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.366697 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863
ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.381355 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636
aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.396662 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafcab86ecc3c83d9dfbd477b70d92f523243231043e280c7f37f9c8e5944a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:33Z\\\",\\\"message\\\":\\\"2026-02-26T08:45:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53\\\\n2026-02-26T08:45:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53 to /host/opt/cni/bin/\\\\n2026-02-26T08:45:48Z [verbose] multus-daemon started\\\\n2026-02-26T08:45:48Z [verbose] 
Readiness Indicator file check\\\\n2026-02-26T08:46:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.418545 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7dc4395414e112ae29765029027afe8e453d9d9fadc956293c8452aa474f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:11Z\\\",\\\"message\\\":\\\" 7057 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:46:11.381663 7057 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 08:46:11.381712 7057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 08:46:11.381755 7057 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 08:46:11.381786 7057 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 08:46:11.381810 7057 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:46:11.381860 7057 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:46:11.381915 7057 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:46:11.381936 7057 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:46:11.381948 7057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 08:46:11.381921 7057 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:46:11.381964 7057 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 08:46:11.381968 7057 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:46:11.381927 7057 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 08:46:11.382007 7057 factory.go:656] Stopping watch factory\\\\nI0226 08:46:11.382043 7057 ovnkube.go:599] Stopped ovnkube\\\\nI0226 
08:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:46:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.430069 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.445815 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.457587 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.476784 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9bb4d7-d949-4ead-b4d5-4a603104c6d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11597156d0bd5fe891962baaac60089238f8ef371feffd647a62af174bb6dfbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aec7efa5468d3890f22dadb3185aa046a9de96f08c80d8a3ecaa0ea21c404dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5736abc0c618dc7ce8a2d18bba180888ba30d10c8648ba3d99309564717819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f29be76c16486a46b39eabf82b3cf56b4b9060dd7893da06a3f1de3ab7728a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c22a47f6f43936c0e85d447fd31a331cd28cbf20cafaa5111f576b7b9445d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:31Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:42 crc kubenswrapper[4925]: I0226 08:46:42.491048 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d9e1b68-1e45-437a-aedf-893164e03e6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81f36248f49499708f8ebd43d539ad4ddcbbc0b098a87af3ded6150b306caf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf88c570a0cceaa05b873b25d7780e7344246be869f48af83108eada3351e411\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:44:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 08:44:30.772197 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 08:44:30.779873 1 observer_polling.go:159] Starting file observer\\\\nI0226 08:44:30.828121 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 08:44:30.834123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 08:44:53.825355 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 08:44:53.825481 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:44:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://044a78201ffbc5765ab36a83077c152e1d0f0d1d05cd6407aab6851a06e3681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd87d89fc17a251143df8b8fcef958a2d67a7e16d107bf027ecdcbc1865a0d01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c505167e891bafbbdc10082c6e18071e83368b0995c978ac646e943b3d9a5d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:42Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:43 crc kubenswrapper[4925]: I0226 08:46:43.506479 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:43 crc kubenswrapper[4925]: I0226 08:46:43.506645 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:43 crc kubenswrapper[4925]: I0226 08:46:43.506680 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:43 crc kubenswrapper[4925]: I0226 08:46:43.506644 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:43 crc kubenswrapper[4925]: E0226 08:46:43.506847 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:43 crc kubenswrapper[4925]: E0226 08:46:43.506929 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:43 crc kubenswrapper[4925]: E0226 08:46:43.506683 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:43 crc kubenswrapper[4925]: E0226 08:46:43.507077 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:43 crc kubenswrapper[4925]: E0226 08:46:43.623353 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 08:46:45 crc kubenswrapper[4925]: I0226 08:46:45.506891 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:45 crc kubenswrapper[4925]: I0226 08:46:45.506928 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:45 crc kubenswrapper[4925]: I0226 08:46:45.506972 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:45 crc kubenswrapper[4925]: E0226 08:46:45.507028 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:45 crc kubenswrapper[4925]: I0226 08:46:45.507050 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:45 crc kubenswrapper[4925]: E0226 08:46:45.507182 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:45 crc kubenswrapper[4925]: E0226 08:46:45.507228 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:45 crc kubenswrapper[4925]: E0226 08:46:45.507276 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:47 crc kubenswrapper[4925]: I0226 08:46:47.506934 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:47 crc kubenswrapper[4925]: I0226 08:46:47.507011 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:47 crc kubenswrapper[4925]: I0226 08:46:47.507014 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:47 crc kubenswrapper[4925]: E0226 08:46:47.507104 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:47 crc kubenswrapper[4925]: I0226 08:46:47.506958 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:47 crc kubenswrapper[4925]: E0226 08:46:47.507257 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:47 crc kubenswrapper[4925]: E0226 08:46:47.507379 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:47 crc kubenswrapper[4925]: E0226 08:46:47.507503 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.526632 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9bb4d7-d949-4ead-b4d5-4a603104c6d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11597156d0bd5fe891962baaac60089238f8ef371feffd647a62af174bb6dfbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aec7efa5468d3890f22dadb3185aa046a9de96f08c80d8a3ecaa0ea21c404dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5736abc0c618dc7ce8a2d18bba180888ba30d10c8648ba3d99309564717819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f29be76c16486a46b39eabf82b3cf56b4b9060dd7893da06a3f1de3ab7728a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c22a47f6f43936c0e85d447fd31a331cd28cbf20cafaa5111f576b7b9445d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9b
e8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.540017 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d9e1b68-1e45-437a-aedf-893164e03e6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81f36248f49499708f8ebd43d539ad4ddcbbc0b098a87af3ded6150b306caf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf88c570a0cceaa05b873b25d7780e7344246be869f48af83108eada3351e411\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:44:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 08:44:30.772197 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 08:44:30.779873 1 observer_polling.go:159] Starting file observer\\\\nI0226 08:44:30.828121 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 08:44:30.834123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 08:44:53.825355 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 08:44:53.825481 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:44:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://044a78201ffbc5765ab36a83077c152e1d0f0d1d05cd6407aab6851a06e3681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd87d89fc17a251143df8b8fcef958a2d67a7e16d107bf027ecdcbc1865a0d01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c505167e891bafbbdc10082c6e18071e83368b0995c978ac646e943b3d9a5d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.551572 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.568872 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.585484 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.600457 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.612893 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: E0226 08:46:48.625152 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.626938 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.639412 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc 
kubenswrapper[4925]: I0226 08:46:48.652612 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.663027 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.663158 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.674106 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.688121 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafcab86ecc3c83d9dfbd477b70d92f523243231043e280c7f37f9c8e5944a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://
5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:33Z\\\",\\\"message\\\":\\\"2026-02-26T08:45:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53\\\\n2026-02-26T08:45:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53 to /host/opt/cni/bin/\\\\n2026-02-26T08:45:48Z [verbose] multus-daemon started\\\\n2026-02-26T08:45:48Z [verbose] Readiness Indicator file check\\\\n2026-02-26T08:46:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/l
ib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.709062 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7dc4395414e112ae29765029027afe8e453d9d9fadc956293c8452aa474f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:11Z\\\",\\\"message\\\":\\\" 7057 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:46:11.381663 7057 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 08:46:11.381712 7057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 08:46:11.381755 7057 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI0226 08:46:11.381786 7057 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 08:46:11.381810 7057 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:46:11.381860 7057 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:46:11.381915 7057 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:46:11.381936 7057 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:46:11.381948 7057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 08:46:11.381921 7057 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:46:11.381964 7057 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 08:46:11.381968 7057 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:46:11.381927 7057 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 08:46:11.382007 7057 factory.go:656] Stopping watch factory\\\\nI0226 08:46:11.382043 7057 ovnkube.go:599] Stopped ovnkube\\\\nI0226 
08:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:46:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.719925 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.733423 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.746018 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.759195 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.771702 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.783752 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.797342 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.806735 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.820105 4925 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafcab86ecc3c83d9dfbd477b70d92f523243231043e280c7f37f9c8e5944a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:33Z\\\",\\\"message\\\":\\\"2026-02-26T08:45:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53\\\\n2026-02-26T08:45:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53 to 
/host/opt/cni/bin/\\\\n2026-02-26T08:45:48Z [verbose] multus-daemon started\\\\n2026-02-26T08:45:48Z [verbose] Readiness Indicator file check\\\\n2026-02-26T08:46:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-cert
s\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.840567 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7dc4395414e112ae29765029027afe8e453d9d9fadc956293c8452aa474f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:11Z\\\",\\\"message\\\":\\\" 7057 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:46:11.381663 7057 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 08:46:11.381712 7057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 08:46:11.381755 7057 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 08:46:11.381786 7057 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 08:46:11.381810 7057 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:46:11.381860 7057 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:46:11.381915 7057 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:46:11.381936 7057 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:46:11.381948 7057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 08:46:11.381921 7057 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:46:11.381964 7057 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 08:46:11.381968 7057 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:46:11.381927 7057 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 08:46:11.382007 7057 factory.go:656] Stopping watch factory\\\\nI0226 08:46:11.382043 7057 ovnkube.go:599] Stopped ovnkube\\\\nI0226 
08:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:46:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.862585 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9bb4d7-d949-4ead-b4d5-4a603104c6d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11597156d0bd5fe891962baaac60089238f8ef371feffd647a62af174bb6dfbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aec7efa5468d3890f22dadb3185aa046a9de96f08c80d8a3ecaa0ea21c404dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5736abc0c618dc7ce8a2d18bba180888ba30d10c8648ba3d99309564717819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f29be76c16486a46b39eabf82b3cf56b4b9060dd7893da06a3f1de3ab7728a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c22a47f6f43936c0e85d447fd31a331cd28cbf20cafaa5111f576b7b9445d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.876665 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d9e1b68-1e45-437a-aedf-893164e03e6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81f36248f49499708f8ebd43d539ad4ddcbbc0b098a87af3ded6150b306caf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf88c570a0cceaa05b873b25d7780e7344246be869f48af83108eada3351e411\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:44:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 08:44:30.772197 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0226 08:44:30.779873 1 observer_polling.go:159] Starting file observer\\\\nI0226 08:44:30.828121 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 08:44:30.834123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 08:44:53.825355 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 08:44:53.825481 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:44:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://044a78201ffbc5765ab36a83077c152e1d0f0d1d05cd6407aab6851a06e3681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd87d89fc17a251143df8b8fcef958a2d67a7e16d107bf027ecdcbc1865a0d01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c505167e891bafbbdc10082c6e18071e83368b0995c978ac646e943b3d9a5d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.888345 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.900103 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.915394 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.931025 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.945755 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.958722 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.970479 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:48 crc kubenswrapper[4925]: I0226 08:46:48.981618 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:48Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:49 crc 
kubenswrapper[4925]: I0226 08:46:49.506277 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.506597 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:49 crc kubenswrapper[4925]: I0226 08:46:49.506639 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:49 crc kubenswrapper[4925]: I0226 08:46:49.506713 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.506831 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:49 crc kubenswrapper[4925]: I0226 08:46:49.506882 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.506962 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.507202 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:49 crc kubenswrapper[4925]: I0226 08:46:49.519102 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 26 08:46:49 crc kubenswrapper[4925]: I0226 08:46:49.708956 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:46:49 crc kubenswrapper[4925]: I0226 08:46:49.709044 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " 
pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:49 crc kubenswrapper[4925]: I0226 08:46:49.709068 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:49 crc kubenswrapper[4925]: I0226 08:46:49.709090 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709117 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:53.709096483 +0000 UTC m=+205.848670766 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:46:49 crc kubenswrapper[4925]: I0226 08:46:49.709144 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709169 4925 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:46:49 crc kubenswrapper[4925]: I0226 08:46:49.709175 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709205 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:47:53.709195206 +0000 UTC m=+205.848769479 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709265 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709268 4925 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709388 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs podName:c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc nodeName:}" failed. No retries permitted until 2026-02-26 08:47:53.70936197 +0000 UTC m=+205.848936293 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs") pod "network-metrics-daemon-96njb" (UID: "c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709389 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709446 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709464 4925 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709281 4925 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709281 4925 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709567 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-26 08:47:53.709516944 +0000 UTC m=+205.849091287 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709581 4925 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709598 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 08:47:53.709584365 +0000 UTC m=+205.849158768 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 08:46:49 crc kubenswrapper[4925]: E0226 08:46:49.709626 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 08:47:53.709610586 +0000 UTC m=+205.849184959 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.609850 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.609889 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.609898 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.609913 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.609923 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:50Z","lastTransitionTime":"2026-02-26T08:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:50 crc kubenswrapper[4925]: E0226 08:46:50.627495 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.631943 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.631975 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.631986 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.632000 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.632010 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:50Z","lastTransitionTime":"2026-02-26T08:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:50 crc kubenswrapper[4925]: E0226 08:46:50.650821 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.655480 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.655516 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.655543 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.655560 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.655573 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:50Z","lastTransitionTime":"2026-02-26T08:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:50 crc kubenswrapper[4925]: E0226 08:46:50.674882 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.678843 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.678875 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.678891 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.678907 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.678918 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:50Z","lastTransitionTime":"2026-02-26T08:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:50 crc kubenswrapper[4925]: E0226 08:46:50.695946 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.700782 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.700814 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.700825 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.700842 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:46:50 crc kubenswrapper[4925]: I0226 08:46:50.700856 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:46:50Z","lastTransitionTime":"2026-02-26T08:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:46:50 crc kubenswrapper[4925]: E0226 08:46:50.714792 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:50Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:50 crc kubenswrapper[4925]: E0226 08:46:50.714901 4925 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 08:46:51 crc kubenswrapper[4925]: I0226 08:46:51.506488 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:51 crc kubenswrapper[4925]: I0226 08:46:51.506563 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:51 crc kubenswrapper[4925]: I0226 08:46:51.506619 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:51 crc kubenswrapper[4925]: I0226 08:46:51.506488 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:51 crc kubenswrapper[4925]: E0226 08:46:51.506781 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:51 crc kubenswrapper[4925]: E0226 08:46:51.506945 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:51 crc kubenswrapper[4925]: E0226 08:46:51.507109 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:51 crc kubenswrapper[4925]: E0226 08:46:51.507299 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:53 crc kubenswrapper[4925]: I0226 08:46:53.507052 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:53 crc kubenswrapper[4925]: I0226 08:46:53.507083 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:53 crc kubenswrapper[4925]: I0226 08:46:53.507151 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:53 crc kubenswrapper[4925]: I0226 08:46:53.507237 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:53 crc kubenswrapper[4925]: E0226 08:46:53.507459 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:53 crc kubenswrapper[4925]: E0226 08:46:53.507635 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:53 crc kubenswrapper[4925]: E0226 08:46:53.507743 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:53 crc kubenswrapper[4925]: E0226 08:46:53.507800 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:53 crc kubenswrapper[4925]: E0226 08:46:53.627119 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 08:46:55 crc kubenswrapper[4925]: I0226 08:46:55.506177 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:55 crc kubenswrapper[4925]: I0226 08:46:55.506252 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:55 crc kubenswrapper[4925]: I0226 08:46:55.506319 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:55 crc kubenswrapper[4925]: E0226 08:46:55.506315 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:55 crc kubenswrapper[4925]: I0226 08:46:55.506392 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:55 crc kubenswrapper[4925]: E0226 08:46:55.506506 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:55 crc kubenswrapper[4925]: E0226 08:46:55.506567 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:55 crc kubenswrapper[4925]: E0226 08:46:55.506634 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:57 crc kubenswrapper[4925]: I0226 08:46:57.506331 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:57 crc kubenswrapper[4925]: I0226 08:46:57.506700 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:57 crc kubenswrapper[4925]: I0226 08:46:57.506749 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:57 crc kubenswrapper[4925]: I0226 08:46:57.506671 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:57 crc kubenswrapper[4925]: E0226 08:46:57.506787 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:57 crc kubenswrapper[4925]: E0226 08:46:57.506924 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:57 crc kubenswrapper[4925]: E0226 08:46:57.507119 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:57 crc kubenswrapper[4925]: E0226 08:46:57.507217 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.521842 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://324fd3da098430c5f59f5fd46c539547214058e53623ca681063172d8f4cef9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.538078 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.556653 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://828ed8a437fbdc35034193d0949f366b76c71e07bee91f5c17e79deeda19f798\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23fa8cf662839e9fe72fdbd1c6ea5490d6590f116577a177c8e64bbc54c9b2c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.570706 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-96njb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w7bg6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-96njb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc 
kubenswrapper[4925]: I0226 08:46:58.594321 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fe9a195-8e21-4517-b680-da420bb30a5e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e47f2144efa4
2ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:45:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 08:45:40.228028 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 08:45:40.228154 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 08:45:40.228798 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1213763700/tls.crt::/tmp/serving-cert-1213763700/tls.key\\\\\\\"\\\\nI0226 08:45:40.413099 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 08:45:40.416875 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 08:45:40.416908 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 08:45:40.416938 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 08:45:40.416942 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 08:45:40.425279 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 08:45:40.425330 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425342 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 08:45:40.425351 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 08:45:40.425358 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 08:45:40.425364 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 
08:45:40.425370 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 08:45:40.425824 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 08:45:40.426663 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.608572 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b1d2578-5ed8-471e-856e-5b016da3f236\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://990154303bba18671e0c8601e90ccb022863224f32cb653579f8f972de55f483\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f68ff0a305f7ccec9431dd57bd882f725363e4520a017db23c9d8b835b03a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c30bacae7ab1f7110329b058b5ea4ef0480906d0e83e320cd294610acf9c1b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b7f9b59d0aa26aadc6cc83fa9d4b84b865fe7780a61ba035626b0636ab577fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3b7f9b59d0aa26aadc6cc83fa9d4b84b865fe7780a61ba035626b0636ab577fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.621226 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52bdf0f6-9afd-42fe-9a30-7da82bc7f619\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84a21c2fc8cf1209c21de4900e6c272a1433c1b127aef98fd5d038aa90e3c052\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863
ff5e7f29c6d18e2e04550a1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sp4kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s6pcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: E0226 08:46:58.628134 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.634389 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdea74e1-d37c-4aa3-861e-6d4eab3a870a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a93e3dd92c40b3ec0ae58bbd0e3465dd85f9c0d4b8f31edc8f51e2561317f02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37690dacd443d0a7bd5d3f41859044c7ff636aac70a6fce47219276f12aef674\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7vdxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6hpsj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.651338 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fmxxp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"03d05327-5621-49e8-8051-9dccd63f3a5b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:46:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bafcab86ecc3c83d9dfbd477b70d92f523243231043e280c7f37f9c8e5944a85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:33Z\\\",\\\"message\\\":\\\"2026-02-26T08:45:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53\\\\n2026-02-26T08:45:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b4b70cc5-d98c-40fd-b31f-4093c4dc2d53 to /host/opt/cni/bin/\\\\n2026-02-26T08:45:48Z [verbose] multus-daemon started\\\\n2026-02-26T08:45:48Z [verbose] 
Readiness Indicator file check\\\\n2026-02-26T08:46:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jrhqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fmxxp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.681231 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"671c7a2c-75d0-4322-a581-3d1b26bda633\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://170db708e79c5b0011921574a1aaacbbe816433d99662e0df517e9d09944c526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0800e2a24e3571125605c498dbc628968fc1a0ad2d148c6f64d62bb691e3ec76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ab569a41f217772effab28101a7a7f0ecf89973d36f2cea8ed21ba4336eef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce2281bdadef72a938ca9c5d850adb0997bb5da357edaa560e5deaca6ad4d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cbdcc6677bce28cba2900abac4240f44b8d3bf5abd65f4cc3bdcec77c3be23d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://656f3d0f5d3639aca397a68715e41226ddcdec0786494d1e375db35633ae90ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c7dc4395414e112ae29765029027afe8e453d9d9fadc956293c8452aa474f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T08:46:11Z\\\",\\\"message\\\":\\\" 7057 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 08:46:11.381663 7057 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 08:46:11.381712 7057 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 08:46:11.381755 7057 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 08:46:11.381786 7057 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 08:46:11.381810 7057 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 08:46:11.381860 7057 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 08:46:11.381915 7057 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0226 08:46:11.381936 7057 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 08:46:11.381948 7057 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0226 08:46:11.381921 7057 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 08:46:11.381964 7057 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 08:46:11.381968 7057 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 08:46:11.381927 7057 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 08:46:11.382007 7057 factory.go:656] Stopping watch factory\\\\nI0226 08:46:11.382043 7057 ovnkube.go:599] Stopped ovnkube\\\\nI0226 
08:46:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:46:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43b407b91fb4667e6f22ab0af6e741ef0e341feeb07fc8af8bd50607632db20f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://723521bc83d89438cee1aff78276142e0aac2ccdc72536cc00c711af9f97d366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-htmnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pv859\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.691702 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n8ts9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"700c9110-ceb1-4a4a-8fea-0fca5904ff5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c84eb252ab0c0c32c4647d6d701467ae9102bbf8de8d38bb1f427b9d16354931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwkvp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n8ts9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.704860 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.715598 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ztgdx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d171f4c-5b3c-498e-acee-1725e2921586\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9776065b066bffec937d56cfc534696ce31b6813d6a07020b9cfde6bffc331f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kjl2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ztgdx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.734001 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b9bb4d7-d949-4ead-b4d5-4a603104c6d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11597156d0bd5fe891962baaac60089238f8ef371feffd647a62af174bb6dfbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aec7efa5468d3890f22dadb3185aa046a9de96f08c80d8a3ecaa0ea21c404dc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d5736abc0c618dc7ce8a2d18bba180888ba30d10c8648ba3d99309564717819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8f29be76c16486a46b39eabf82b3cf56b4b9060dd7893da06a3f1de3ab7728a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://389c22a47f6f43936c0e85d447fd31a331cd28cbf20cafaa5111f576b7b9445d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a6598532e716cfb0791d94c99c6c1169c328a52d6d4e919e022165c70f60248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ada6ef4c7044b42633b0b0fedf93bf16079a4ddfe11d16503fc7eef4bb14d29c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05aa1a5a9e85f37d1b27585f947228a98e5f3d75c65b99f1ef1835d21e620396\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:44:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.748607 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d9e1b68-1e45-437a-aedf-893164e03e6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:44:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81f36248f49499708f8ebd43d539ad4ddcbbc0b098a87af3ded6150b306caf82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf88c570a0cceaa05b873b25d7780e7344246be869f48af83108eada3351e411\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T08:44:53Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 08:44:30.772197 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 08:44:30.779873 1 observer_polling.go:159] Starting file observer\\\\nI0226 08:44:30.828121 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 08:44:30.834123 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 08:44:53.825355 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 08:44:53.825481 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:44:53Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://044a78201ffbc5765ab36a83077c152e1d0f0d1d05cd6407aab6851a06e3681b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd87d89fc17a251143df8b8fcef958a2d67a7e16d107bf027ecdcbc1865a0d01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c505167e891bafbbdc10082c6e18071e83368b0995c978ac646e943b3d9a5d6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:44:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:44:28Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.762737 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://85ec28dc6a10c9e16087d483ccdf6d4c7d81ca687a8408a30818fd6cc842cab0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.779071 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:58 crc kubenswrapper[4925]: I0226 08:46:58.796113 4925 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90c32fd9-f2da-4ba9-8cf9-703b6de14025\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T08:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5524499eeebfb8bc504bd721b1d7d480ffabe222b1c69f03e9323f4825c0ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T08:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6215bc537cd1f5cd8d9979bcec95b7456e121ba3db59ae222fe44276965a1ad8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://984ab720068268a3131e6e20c73706265d3e826c3e6cb2a6a90d2374fea59faa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://963bd1bbd44fac6d136fd0b52d665e790edc1eccb9bc4d3908294acf74165cae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d65f5
e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d65f5e3513db851d7197f0f1958c812d107ee96df2bad3af8d681297618e84a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb07288d59e16c0832c1a932e45e1e0b284aca3c9646d0c679beeda40a4cd334\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://223ebab9df29afd433157691431a15eaa42942f396b51e43c618578cb1e87713\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T08:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T08:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rngdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T08:45:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-sz9t7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:46:58Z is after 2025-08-24T17:21:41Z" Feb 26 08:46:59 crc kubenswrapper[4925]: I0226 08:46:59.506368 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:46:59 crc kubenswrapper[4925]: I0226 08:46:59.506609 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:46:59 crc kubenswrapper[4925]: E0226 08:46:59.506755 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:46:59 crc kubenswrapper[4925]: I0226 08:46:59.506964 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:46:59 crc kubenswrapper[4925]: I0226 08:46:59.506991 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:46:59 crc kubenswrapper[4925]: E0226 08:46:59.507049 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:46:59 crc kubenswrapper[4925]: E0226 08:46:59.507517 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:46:59 crc kubenswrapper[4925]: E0226 08:46:59.507657 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.823631 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.824193 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.824272 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.824350 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.824427 4925 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:47:00Z","lastTransitionTime":"2026-02-26T08:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:47:00 crc kubenswrapper[4925]: E0226 08:47:00.840080 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:47:00Z is after 2025-08-24T17:21:41Z" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.845414 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.845462 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.845475 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.845491 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.845503 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:47:00Z","lastTransitionTime":"2026-02-26T08:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:47:00 crc kubenswrapper[4925]: E0226 08:47:00.860385 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:47:00Z is after 2025-08-24T17:21:41Z" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.864110 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.864283 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.864419 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.864582 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.864720 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:47:00Z","lastTransitionTime":"2026-02-26T08:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:47:00 crc kubenswrapper[4925]: E0226 08:47:00.877626 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:47:00Z is after 2025-08-24T17:21:41Z" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.882281 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.882489 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.882642 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.882744 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.882830 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:47:00Z","lastTransitionTime":"2026-02-26T08:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:47:00 crc kubenswrapper[4925]: E0226 08:47:00.894581 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:47:00Z is after 2025-08-24T17:21:41Z" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.897991 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.898049 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.898066 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.898088 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:47:00 crc kubenswrapper[4925]: I0226 08:47:00.898104 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:47:00Z","lastTransitionTime":"2026-02-26T08:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 08:47:00 crc kubenswrapper[4925]: E0226 08:47:00.912406 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T08:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b7b6b86-360e-4b52-88cf-71d654bc9098\\\",\\\"systemUUID\\\":\\\"5bb49b2f-79fa-45bc-b6a0-b1c1133f1d9e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T08:47:00Z is after 2025-08-24T17:21:41Z" Feb 26 08:47:00 crc kubenswrapper[4925]: E0226 08:47:00.912725 4925 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 08:47:01 crc kubenswrapper[4925]: I0226 08:47:01.506850 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:01 crc kubenswrapper[4925]: I0226 08:47:01.506890 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:01 crc kubenswrapper[4925]: I0226 08:47:01.507084 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:01 crc kubenswrapper[4925]: I0226 08:47:01.507122 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:01 crc kubenswrapper[4925]: E0226 08:47:01.507189 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:47:01 crc kubenswrapper[4925]: E0226 08:47:01.507421 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:47:01 crc kubenswrapper[4925]: E0226 08:47:01.507701 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:47:01 crc kubenswrapper[4925]: E0226 08:47:01.507833 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:47:01 crc kubenswrapper[4925]: I0226 08:47:01.520315 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.289621 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/3.log" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.290266 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/2.log" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.293035 4925 generic.go:334] "Generic (PLEG): container finished" podID="671c7a2c-75d0-4322-a581-3d1b26bda633" containerID="2c7dc4395414e112ae29765029027afe8e453d9d9fadc956293c8452aa474f85" exitCode=1 Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.293072 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerDied","Data":"2c7dc4395414e112ae29765029027afe8e453d9d9fadc956293c8452aa474f85"} Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.293105 4925 scope.go:117] "RemoveContainer" containerID="efa8224971b5baf7bfaf8cb69a7d36925d80dccbd5739acf526f28e8cacc24a4" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.294157 4925 scope.go:117] "RemoveContainer" containerID="2c7dc4395414e112ae29765029027afe8e453d9d9fadc956293c8452aa474f85" Feb 26 08:47:03 crc kubenswrapper[4925]: E0226 08:47:03.294395 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" podUID="671c7a2c-75d0-4322-a581-3d1b26bda633" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.333836 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=52.33382085 podStartE2EDuration="52.33382085s" podCreationTimestamp="2026-02-26 08:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:03.332828126 +0000 UTC m=+155.472402429" watchObservedRunningTime="2026-02-26 08:47:03.33382085 +0000 UTC m=+155.473395133" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.362661 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=25.362645349 podStartE2EDuration="25.362645349s" podCreationTimestamp="2026-02-26 08:46:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:03.347683676 +0000 UTC m=+155.487257969" watchObservedRunningTime="2026-02-26 08:47:03.362645349 +0000 UTC m=+155.502219642" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.393675 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sz9t7" podStartSLOduration=107.39365805 podStartE2EDuration="1m47.39365805s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:03.393614859 +0000 UTC m=+155.533189162" watchObservedRunningTime="2026-02-26 08:47:03.39365805 +0000 UTC m=+155.533232333" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.479931 4925 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=75.479910939 podStartE2EDuration="1m15.479910939s" podCreationTimestamp="2026-02-26 08:45:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:03.479612641 +0000 UTC m=+155.619186944" watchObservedRunningTime="2026-02-26 08:47:03.479910939 +0000 UTC m=+155.619485222" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.501906 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=14.501890721 podStartE2EDuration="14.501890721s" podCreationTimestamp="2026-02-26 08:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:03.492136395 +0000 UTC m=+155.631710698" watchObservedRunningTime="2026-02-26 08:47:03.501890721 +0000 UTC m=+155.641465004" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.506458 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:03 crc kubenswrapper[4925]: E0226 08:47:03.506582 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.506465 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:03 crc kubenswrapper[4925]: E0226 08:47:03.506638 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.506475 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:03 crc kubenswrapper[4925]: E0226 08:47:03.506683 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.506458 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:03 crc kubenswrapper[4925]: E0226 08:47:03.506725 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.517218 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.517198182 podStartE2EDuration="2.517198182s" podCreationTimestamp="2026-02-26 08:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:03.502425214 +0000 UTC m=+155.641999497" watchObservedRunningTime="2026-02-26 08:47:03.517198182 +0000 UTC m=+155.656772475" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.517305 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" podStartSLOduration=108.517301634 podStartE2EDuration="1m48.517301634s" podCreationTimestamp="2026-02-26 08:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:03.516875184 +0000 UTC m=+155.656449487" watchObservedRunningTime="2026-02-26 08:47:03.517301634 +0000 UTC m=+155.656875927" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.533275 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6hpsj" podStartSLOduration=107.53325874 podStartE2EDuration="1m47.53325874s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:03.532677076 +0000 UTC m=+155.672251399" watchObservedRunningTime="2026-02-26 08:47:03.53325874 +0000 UTC m=+155.672833033" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.548780 4925 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-multus/multus-fmxxp" podStartSLOduration=107.548753536 podStartE2EDuration="1m47.548753536s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:03.547623018 +0000 UTC m=+155.687197331" watchObservedRunningTime="2026-02-26 08:47:03.548753536 +0000 UTC m=+155.688327819" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.578830 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n8ts9" podStartSLOduration=108.578812504 podStartE2EDuration="1m48.578812504s" podCreationTimestamp="2026-02-26 08:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:03.577742548 +0000 UTC m=+155.717316831" watchObservedRunningTime="2026-02-26 08:47:03.578812504 +0000 UTC m=+155.718386797" Feb 26 08:47:03 crc kubenswrapper[4925]: I0226 08:47:03.602280 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ztgdx" podStartSLOduration=108.602264482 podStartE2EDuration="1m48.602264482s" podCreationTimestamp="2026-02-26 08:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:03.601294958 +0000 UTC m=+155.740869261" watchObservedRunningTime="2026-02-26 08:47:03.602264482 +0000 UTC m=+155.741838765" Feb 26 08:47:03 crc kubenswrapper[4925]: E0226 08:47:03.629152 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 08:47:04 crc kubenswrapper[4925]: I0226 08:47:04.299633 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/3.log" Feb 26 08:47:05 crc kubenswrapper[4925]: I0226 08:47:05.506764 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:05 crc kubenswrapper[4925]: I0226 08:47:05.506807 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:05 crc kubenswrapper[4925]: I0226 08:47:05.506764 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:05 crc kubenswrapper[4925]: E0226 08:47:05.506886 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:47:05 crc kubenswrapper[4925]: I0226 08:47:05.507023 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:05 crc kubenswrapper[4925]: E0226 08:47:05.507173 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:47:05 crc kubenswrapper[4925]: E0226 08:47:05.507355 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:47:05 crc kubenswrapper[4925]: E0226 08:47:05.507585 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:47:07 crc kubenswrapper[4925]: I0226 08:47:07.506084 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:07 crc kubenswrapper[4925]: I0226 08:47:07.506130 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:07 crc kubenswrapper[4925]: E0226 08:47:07.506219 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:47:07 crc kubenswrapper[4925]: I0226 08:47:07.506093 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:07 crc kubenswrapper[4925]: E0226 08:47:07.506326 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:47:07 crc kubenswrapper[4925]: I0226 08:47:07.506362 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:07 crc kubenswrapper[4925]: E0226 08:47:07.506433 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:47:07 crc kubenswrapper[4925]: E0226 08:47:07.506725 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:47:08 crc kubenswrapper[4925]: E0226 08:47:08.629793 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 08:47:09 crc kubenswrapper[4925]: I0226 08:47:09.506064 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:09 crc kubenswrapper[4925]: E0226 08:47:09.506278 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:47:09 crc kubenswrapper[4925]: I0226 08:47:09.506681 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:09 crc kubenswrapper[4925]: I0226 08:47:09.506780 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:09 crc kubenswrapper[4925]: I0226 08:47:09.506807 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:09 crc kubenswrapper[4925]: E0226 08:47:09.507301 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:47:09 crc kubenswrapper[4925]: E0226 08:47:09.507417 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:47:09 crc kubenswrapper[4925]: E0226 08:47:09.507485 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.075604 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.075700 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.075719 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.075743 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.075762 4925 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T08:47:11Z","lastTransitionTime":"2026-02-26T08:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.142795 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk"] Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.143463 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.146232 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.146620 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.146677 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.148029 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.231138 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b3bfd17-9d78-4cbb-8261-c856015273e6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.231645 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3b3bfd17-9d78-4cbb-8261-c856015273e6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.231771 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/3b3bfd17-9d78-4cbb-8261-c856015273e6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.231899 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b3bfd17-9d78-4cbb-8261-c856015273e6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.231985 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3b3bfd17-9d78-4cbb-8261-c856015273e6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.332772 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3b3bfd17-9d78-4cbb-8261-c856015273e6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.333251 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b3bfd17-9d78-4cbb-8261-c856015273e6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.332872 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3b3bfd17-9d78-4cbb-8261-c856015273e6-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.333422 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b3bfd17-9d78-4cbb-8261-c856015273e6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.333580 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3b3bfd17-9d78-4cbb-8261-c856015273e6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.333701 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b3bfd17-9d78-4cbb-8261-c856015273e6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.333721 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/3b3bfd17-9d78-4cbb-8261-c856015273e6-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.334408 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b3bfd17-9d78-4cbb-8261-c856015273e6-service-ca\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.349449 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b3bfd17-9d78-4cbb-8261-c856015273e6-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.352973 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b3bfd17-9d78-4cbb-8261-c856015273e6-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-648bk\" (UID: \"3b3bfd17-9d78-4cbb-8261-c856015273e6\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.461428 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" Feb 26 08:47:11 crc kubenswrapper[4925]: W0226 08:47:11.478691 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b3bfd17_9d78_4cbb_8261_c856015273e6.slice/crio-825b4b20fc795aa2452fd7e8b19f7a7c11da8fd4764fcf6564cdc36a1ba817f0 WatchSource:0}: Error finding container 825b4b20fc795aa2452fd7e8b19f7a7c11da8fd4764fcf6564cdc36a1ba817f0: Status 404 returned error can't find the container with id 825b4b20fc795aa2452fd7e8b19f7a7c11da8fd4764fcf6564cdc36a1ba817f0 Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.506545 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.506570 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:11 crc kubenswrapper[4925]: E0226 08:47:11.506673 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.506604 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:11 crc kubenswrapper[4925]: E0226 08:47:11.506812 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:47:11 crc kubenswrapper[4925]: E0226 08:47:11.506936 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.509358 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:11 crc kubenswrapper[4925]: E0226 08:47:11.509655 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.632933 4925 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 26 08:47:11 crc kubenswrapper[4925]: I0226 08:47:11.644942 4925 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 26 08:47:12 crc kubenswrapper[4925]: I0226 08:47:12.331900 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" event={"ID":"3b3bfd17-9d78-4cbb-8261-c856015273e6","Type":"ContainerStarted","Data":"5b1a5bf0d94f26a6ba601a670385e3163bad2a28cc23212ace31c0926002ad89"} Feb 26 08:47:12 crc kubenswrapper[4925]: I0226 08:47:12.331957 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" event={"ID":"3b3bfd17-9d78-4cbb-8261-c856015273e6","Type":"ContainerStarted","Data":"825b4b20fc795aa2452fd7e8b19f7a7c11da8fd4764fcf6564cdc36a1ba817f0"} Feb 26 08:47:12 crc kubenswrapper[4925]: I0226 08:47:12.348933 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-648bk" podStartSLOduration=116.348915408 podStartE2EDuration="1m56.348915408s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:12.348731624 +0000 UTC m=+164.488305927" watchObservedRunningTime="2026-02-26 08:47:12.348915408 +0000 UTC m=+164.488489701" Feb 26 08:47:13 crc kubenswrapper[4925]: I0226 08:47:13.507077 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:13 crc kubenswrapper[4925]: I0226 08:47:13.507077 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:13 crc kubenswrapper[4925]: I0226 08:47:13.507111 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:13 crc kubenswrapper[4925]: E0226 08:47:13.507276 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:47:13 crc kubenswrapper[4925]: E0226 08:47:13.507706 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:47:13 crc kubenswrapper[4925]: E0226 08:47:13.507898 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:47:13 crc kubenswrapper[4925]: I0226 08:47:13.508236 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:13 crc kubenswrapper[4925]: E0226 08:47:13.508462 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:47:13 crc kubenswrapper[4925]: E0226 08:47:13.631828 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 08:47:15 crc kubenswrapper[4925]: I0226 08:47:15.506556 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:15 crc kubenswrapper[4925]: I0226 08:47:15.506657 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:15 crc kubenswrapper[4925]: E0226 08:47:15.506729 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:47:15 crc kubenswrapper[4925]: E0226 08:47:15.506829 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:47:15 crc kubenswrapper[4925]: I0226 08:47:15.507209 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:15 crc kubenswrapper[4925]: I0226 08:47:15.507226 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:15 crc kubenswrapper[4925]: E0226 08:47:15.508735 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:47:15 crc kubenswrapper[4925]: I0226 08:47:15.509760 4925 scope.go:117] "RemoveContainer" containerID="2c7dc4395414e112ae29765029027afe8e453d9d9fadc956293c8452aa474f85" Feb 26 08:47:15 crc kubenswrapper[4925]: E0226 08:47:15.509783 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:47:15 crc kubenswrapper[4925]: E0226 08:47:15.510008 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" podUID="671c7a2c-75d0-4322-a581-3d1b26bda633" Feb 26 08:47:17 crc kubenswrapper[4925]: I0226 08:47:17.506327 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:17 crc kubenswrapper[4925]: I0226 08:47:17.506403 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:17 crc kubenswrapper[4925]: E0226 08:47:17.506472 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:47:17 crc kubenswrapper[4925]: E0226 08:47:17.506639 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:47:17 crc kubenswrapper[4925]: I0226 08:47:17.506703 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:17 crc kubenswrapper[4925]: I0226 08:47:17.506751 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:17 crc kubenswrapper[4925]: E0226 08:47:17.506784 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:47:17 crc kubenswrapper[4925]: E0226 08:47:17.506913 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:47:18 crc kubenswrapper[4925]: E0226 08:47:18.632883 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 08:47:19 crc kubenswrapper[4925]: I0226 08:47:19.506602 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:19 crc kubenswrapper[4925]: I0226 08:47:19.506653 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:19 crc kubenswrapper[4925]: E0226 08:47:19.506751 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:47:19 crc kubenswrapper[4925]: I0226 08:47:19.506811 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:19 crc kubenswrapper[4925]: I0226 08:47:19.506920 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:19 crc kubenswrapper[4925]: E0226 08:47:19.507018 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:47:19 crc kubenswrapper[4925]: E0226 08:47:19.507223 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:47:19 crc kubenswrapper[4925]: E0226 08:47:19.507343 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:47:20 crc kubenswrapper[4925]: I0226 08:47:20.356581 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fmxxp_03d05327-5621-49e8-8051-9dccd63f3a5b/kube-multus/1.log"
Feb 26 08:47:20 crc kubenswrapper[4925]: I0226 08:47:20.357392 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fmxxp_03d05327-5621-49e8-8051-9dccd63f3a5b/kube-multus/0.log"
Feb 26 08:47:20 crc kubenswrapper[4925]: I0226 08:47:20.357445 4925 generic.go:334] "Generic (PLEG): container finished" podID="03d05327-5621-49e8-8051-9dccd63f3a5b" containerID="bafcab86ecc3c83d9dfbd477b70d92f523243231043e280c7f37f9c8e5944a85" exitCode=1
Feb 26 08:47:20 crc kubenswrapper[4925]: I0226 08:47:20.357502 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fmxxp" event={"ID":"03d05327-5621-49e8-8051-9dccd63f3a5b","Type":"ContainerDied","Data":"bafcab86ecc3c83d9dfbd477b70d92f523243231043e280c7f37f9c8e5944a85"}
Feb 26 08:47:20 crc kubenswrapper[4925]: I0226 08:47:20.357618 4925 scope.go:117] "RemoveContainer" containerID="5ee1527f4fcda3bc0dc6dd10652495cac6291448ed0aebef1f5854d5780ef63b"
Feb 26 08:47:20 crc kubenswrapper[4925]: I0226 08:47:20.358104 4925 scope.go:117] "RemoveContainer" containerID="bafcab86ecc3c83d9dfbd477b70d92f523243231043e280c7f37f9c8e5944a85"
Feb 26 08:47:20 crc kubenswrapper[4925]: E0226 08:47:20.358357 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-fmxxp_openshift-multus(03d05327-5621-49e8-8051-9dccd63f3a5b)\"" pod="openshift-multus/multus-fmxxp" podUID="03d05327-5621-49e8-8051-9dccd63f3a5b"
Feb 26 08:47:21 crc kubenswrapper[4925]: I0226 08:47:21.363028 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fmxxp_03d05327-5621-49e8-8051-9dccd63f3a5b/kube-multus/1.log"
Feb 26 08:47:21 crc kubenswrapper[4925]: I0226 08:47:21.506852 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:47:21 crc kubenswrapper[4925]: I0226 08:47:21.506908 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:47:21 crc kubenswrapper[4925]: I0226 08:47:21.506923 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:47:21 crc kubenswrapper[4925]: I0226 08:47:21.506887 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:47:21 crc kubenswrapper[4925]: E0226 08:47:21.507052 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:47:21 crc kubenswrapper[4925]: E0226 08:47:21.507186 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:47:21 crc kubenswrapper[4925]: E0226 08:47:21.507220 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:47:21 crc kubenswrapper[4925]: E0226 08:47:21.507293 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:47:23 crc kubenswrapper[4925]: I0226 08:47:23.506968 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:47:23 crc kubenswrapper[4925]: I0226 08:47:23.507023 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:47:23 crc kubenswrapper[4925]: I0226 08:47:23.507005 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:47:23 crc kubenswrapper[4925]: I0226 08:47:23.506968 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:47:23 crc kubenswrapper[4925]: E0226 08:47:23.507199 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:47:23 crc kubenswrapper[4925]: E0226 08:47:23.507306 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:47:23 crc kubenswrapper[4925]: E0226 08:47:23.507437 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:47:23 crc kubenswrapper[4925]: E0226 08:47:23.507633 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:47:23 crc kubenswrapper[4925]: E0226 08:47:23.634890 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 26 08:47:25 crc kubenswrapper[4925]: I0226 08:47:25.506669 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:47:25 crc kubenswrapper[4925]: I0226 08:47:25.506699 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:47:25 crc kubenswrapper[4925]: I0226 08:47:25.506757 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:47:25 crc kubenswrapper[4925]: E0226 08:47:25.506799 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:47:25 crc kubenswrapper[4925]: I0226 08:47:25.506669 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:47:25 crc kubenswrapper[4925]: E0226 08:47:25.506982 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:47:25 crc kubenswrapper[4925]: E0226 08:47:25.507004 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:47:25 crc kubenswrapper[4925]: E0226 08:47:25.507042 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:47:27 crc kubenswrapper[4925]: I0226 08:47:27.506968 4925 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:47:27 crc kubenswrapper[4925]: I0226 08:47:27.507096 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:47:27 crc kubenswrapper[4925]: E0226 08:47:27.507117 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:47:27 crc kubenswrapper[4925]: I0226 08:47:27.507239 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:47:27 crc kubenswrapper[4925]: I0226 08:47:27.507232 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:47:27 crc kubenswrapper[4925]: E0226 08:47:27.507405 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:47:27 crc kubenswrapper[4925]: E0226 08:47:27.507636 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:47:27 crc kubenswrapper[4925]: E0226 08:47:27.507745 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:47:28 crc kubenswrapper[4925]: E0226 08:47:28.635503 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 26 08:47:29 crc kubenswrapper[4925]: I0226 08:47:29.506164 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:47:29 crc kubenswrapper[4925]: E0226 08:47:29.506327 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:47:29 crc kubenswrapper[4925]: I0226 08:47:29.506188 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:47:29 crc kubenswrapper[4925]: I0226 08:47:29.506164 4925 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:47:29 crc kubenswrapper[4925]: E0226 08:47:29.506394 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:47:29 crc kubenswrapper[4925]: I0226 08:47:29.506212 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:47:29 crc kubenswrapper[4925]: E0226 08:47:29.507991 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:47:29 crc kubenswrapper[4925]: E0226 08:47:29.508579 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:47:29 crc kubenswrapper[4925]: I0226 08:47:29.509654 4925 scope.go:117] "RemoveContainer" containerID="2c7dc4395414e112ae29765029027afe8e453d9d9fadc956293c8452aa474f85"
Feb 26 08:47:29 crc kubenswrapper[4925]: E0226 08:47:29.510081 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pv859_openshift-ovn-kubernetes(671c7a2c-75d0-4322-a581-3d1b26bda633)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" podUID="671c7a2c-75d0-4322-a581-3d1b26bda633"
Feb 26 08:47:31 crc kubenswrapper[4925]: I0226 08:47:31.505942 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:47:31 crc kubenswrapper[4925]: I0226 08:47:31.505995 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:47:31 crc kubenswrapper[4925]: I0226 08:47:31.506020 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:47:31 crc kubenswrapper[4925]: E0226 08:47:31.506071 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:47:31 crc kubenswrapper[4925]: I0226 08:47:31.506099 4925 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:47:31 crc kubenswrapper[4925]: E0226 08:47:31.506189 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:47:31 crc kubenswrapper[4925]: E0226 08:47:31.506307 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:47:31 crc kubenswrapper[4925]: E0226 08:47:31.506339 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:47:33 crc kubenswrapper[4925]: I0226 08:47:33.506558 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:47:33 crc kubenswrapper[4925]: I0226 08:47:33.506615 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:47:33 crc kubenswrapper[4925]: I0226 08:47:33.506674 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:47:33 crc kubenswrapper[4925]: E0226 08:47:33.506763 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:47:33 crc kubenswrapper[4925]: I0226 08:47:33.506787 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:47:33 crc kubenswrapper[4925]: E0226 08:47:33.507032 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:47:33 crc kubenswrapper[4925]: E0226 08:47:33.507075 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:47:33 crc kubenswrapper[4925]: E0226 08:47:33.507271 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:47:33 crc kubenswrapper[4925]: I0226 08:47:33.508642 4925 scope.go:117] "RemoveContainer" containerID="bafcab86ecc3c83d9dfbd477b70d92f523243231043e280c7f37f9c8e5944a85"
Feb 26 08:47:33 crc kubenswrapper[4925]: E0226 08:47:33.636296 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 26 08:47:34 crc kubenswrapper[4925]: I0226 08:47:34.405690 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fmxxp_03d05327-5621-49e8-8051-9dccd63f3a5b/kube-multus/1.log"
Feb 26 08:47:34 crc kubenswrapper[4925]: I0226 08:47:34.405763 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fmxxp" event={"ID":"03d05327-5621-49e8-8051-9dccd63f3a5b","Type":"ContainerStarted","Data":"a2f765c232ce95b4b5ca3b1b22bcb285e6f9dc8d026da41ec841fb16cc9eb6c2"}
Feb 26 08:47:35 crc kubenswrapper[4925]: I0226 08:47:35.506831 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:47:35 crc kubenswrapper[4925]: I0226 08:47:35.506845 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:47:35 crc kubenswrapper[4925]: E0226 08:47:35.506971 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:47:35 crc kubenswrapper[4925]: I0226 08:47:35.506994 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:47:35 crc kubenswrapper[4925]: I0226 08:47:35.507011 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:47:35 crc kubenswrapper[4925]: E0226 08:47:35.507162 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:47:35 crc kubenswrapper[4925]: E0226 08:47:35.507433 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:47:35 crc kubenswrapper[4925]: E0226 08:47:35.507512 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:47:37 crc kubenswrapper[4925]: I0226 08:47:37.506508 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:47:37 crc kubenswrapper[4925]: I0226 08:47:37.506759 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:47:37 crc kubenswrapper[4925]: I0226 08:47:37.506638 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:47:37 crc kubenswrapper[4925]: I0226 08:47:37.506607 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:47:37 crc kubenswrapper[4925]: E0226 08:47:37.506889 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:47:37 crc kubenswrapper[4925]: E0226 08:47:37.507080 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:47:37 crc kubenswrapper[4925]: E0226 08:47:37.507185 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:47:37 crc kubenswrapper[4925]: E0226 08:47:37.507314 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:47:38 crc kubenswrapper[4925]: E0226 08:47:38.637412 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 26 08:47:39 crc kubenswrapper[4925]: I0226 08:47:39.506798 4925 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:47:39 crc kubenswrapper[4925]: E0226 08:47:39.507046 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:47:39 crc kubenswrapper[4925]: I0226 08:47:39.506868 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:47:39 crc kubenswrapper[4925]: E0226 08:47:39.507151 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:47:39 crc kubenswrapper[4925]: I0226 08:47:39.506895 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:47:39 crc kubenswrapper[4925]: E0226 08:47:39.507226 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:47:39 crc kubenswrapper[4925]: I0226 08:47:39.506836 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:47:39 crc kubenswrapper[4925]: E0226 08:47:39.507318 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:47:41 crc kubenswrapper[4925]: I0226 08:47:41.506938 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:47:41 crc kubenswrapper[4925]: I0226 08:47:41.506971 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:47:41 crc kubenswrapper[4925]: I0226 08:47:41.506939 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:47:41 crc kubenswrapper[4925]: E0226 08:47:41.507072 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:47:41 crc kubenswrapper[4925]: I0226 08:47:41.506952 4925 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:47:41 crc kubenswrapper[4925]: E0226 08:47:41.507282 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:47:41 crc kubenswrapper[4925]: E0226 08:47:41.507462 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:47:41 crc kubenswrapper[4925]: E0226 08:47:41.507577 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:47:43 crc kubenswrapper[4925]: I0226 08:47:43.506223 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:47:43 crc kubenswrapper[4925]: I0226 08:47:43.506284 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 08:47:43 crc kubenswrapper[4925]: I0226 08:47:43.506249 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 08:47:43 crc kubenswrapper[4925]: I0226 08:47:43.506240 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb"
Feb 26 08:47:43 crc kubenswrapper[4925]: E0226 08:47:43.506506 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 08:47:43 crc kubenswrapper[4925]: E0226 08:47:43.506602 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc"
Feb 26 08:47:43 crc kubenswrapper[4925]: E0226 08:47:43.506657 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 08:47:43 crc kubenswrapper[4925]: E0226 08:47:43.506774 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 08:47:43 crc kubenswrapper[4925]: E0226 08:47:43.638501 4925 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 26 08:47:44 crc kubenswrapper[4925]: I0226 08:47:44.508624 4925 scope.go:117] "RemoveContainer" containerID="2c7dc4395414e112ae29765029027afe8e453d9d9fadc956293c8452aa474f85"
Feb 26 08:47:45 crc kubenswrapper[4925]: I0226 08:47:45.447965 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pv859_671c7a2c-75d0-4322-a581-3d1b26bda633/ovnkube-controller/3.log"
Feb 26 08:47:45 crc kubenswrapper[4925]: I0226 08:47:45.450220 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" event={"ID":"671c7a2c-75d0-4322-a581-3d1b26bda633","Type":"ContainerStarted","Data":"c5fc493a7aca1795422247839e87d77f47c43347df7adfb35cb724bab6d12888"}
Feb 26 08:47:45 crc kubenswrapper[4925]: I0226 08:47:45.450793 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pv859"
Feb 26 08:47:45 crc kubenswrapper[4925]: I0226 08:47:45.506812 4925 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:45 crc kubenswrapper[4925]: E0226 08:47:45.506950 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:47:45 crc kubenswrapper[4925]: I0226 08:47:45.507164 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:45 crc kubenswrapper[4925]: E0226 08:47:45.507209 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:47:45 crc kubenswrapper[4925]: I0226 08:47:45.507313 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:45 crc kubenswrapper[4925]: E0226 08:47:45.507356 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:47:45 crc kubenswrapper[4925]: I0226 08:47:45.507452 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:45 crc kubenswrapper[4925]: E0226 08:47:45.507495 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:47:45 crc kubenswrapper[4925]: I0226 08:47:45.695682 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" podStartSLOduration=149.695658923 podStartE2EDuration="2m29.695658923s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:45.486513784 +0000 UTC m=+197.626088087" watchObservedRunningTime="2026-02-26 08:47:45.695658923 +0000 UTC m=+197.835233226" Feb 26 08:47:45 crc kubenswrapper[4925]: I0226 08:47:45.696034 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-96njb"] Feb 26 08:47:46 crc kubenswrapper[4925]: I0226 08:47:46.453666 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:46 crc kubenswrapper[4925]: E0226 08:47:46.454370 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:47:47 crc kubenswrapper[4925]: I0226 08:47:47.506959 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:47 crc kubenswrapper[4925]: I0226 08:47:47.506959 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:47 crc kubenswrapper[4925]: E0226 08:47:47.507194 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 08:47:47 crc kubenswrapper[4925]: I0226 08:47:47.506996 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:47 crc kubenswrapper[4925]: I0226 08:47:47.506983 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:47 crc kubenswrapper[4925]: E0226 08:47:47.507303 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 08:47:47 crc kubenswrapper[4925]: E0226 08:47:47.507442 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 08:47:47 crc kubenswrapper[4925]: E0226 08:47:47.507580 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96njb" podUID="c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc" Feb 26 08:47:49 crc kubenswrapper[4925]: I0226 08:47:49.506613 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:49 crc kubenswrapper[4925]: I0226 08:47:49.506703 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:49 crc kubenswrapper[4925]: I0226 08:47:49.506809 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:49 crc kubenswrapper[4925]: I0226 08:47:49.506846 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:49 crc kubenswrapper[4925]: I0226 08:47:49.513124 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 26 08:47:49 crc kubenswrapper[4925]: I0226 08:47:49.515459 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 26 08:47:49 crc kubenswrapper[4925]: I0226 08:47:49.515483 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 26 08:47:49 crc kubenswrapper[4925]: I0226 08:47:49.515574 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 26 08:47:49 crc kubenswrapper[4925]: I0226 08:47:49.515459 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 26 08:47:49 crc kubenswrapper[4925]: I0226 08:47:49.515508 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.033287 4925 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.087356 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nkgtm"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 
08:47:52.087847 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8hj5v"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.088029 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.088368 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.089955 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.090175 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.093462 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gl96d"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.093918 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bpqx7"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.094272 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.094452 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.095079 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.095631 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-djlxf"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.096114 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.096452 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.096626 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.097310 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.099319 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.099451 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.099693 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.099834 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.099920 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.099994 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.100118 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.099334 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.099375 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.099960 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.100762 4925 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.108202 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.108885 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.111291 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.112043 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.112171 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.113155 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.113583 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.130843 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.131056 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.131578 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.131978 4925 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.132214 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.132333 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.132607 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pxx78"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.132368 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.133145 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.133333 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.133427 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.134029 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-kv2xl"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.134577 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.134622 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.134913 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.136043 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.136654 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.136718 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.136944 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.138167 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.138599 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.138746 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.138843 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.138940 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.139024 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.140092 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.140288 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 26 08:47:52 crc 
kubenswrapper[4925]: I0226 08:47:52.140402 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.140577 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.140617 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.140726 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.140816 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.140909 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.140952 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.141194 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.141350 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.141404 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.141456 4925 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.141211 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.141619 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.141656 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.141716 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.141917 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.142097 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.142211 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.142407 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.142796 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-btcxz"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.143867 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.144265 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.145543 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t62dr"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.145956 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-t62dr" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.147489 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.149386 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.149826 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.149957 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.150021 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.150166 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.150490 4925 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.150823 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.151947 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.152603 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.152855 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.152963 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.152866 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.153091 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.153205 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.154076 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-b5l7b"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.154797 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-b5l7b"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.155298 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nsvcl"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.156075 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nsvcl"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.156748 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.157412 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.158631 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.159298 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.160488 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.160786 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.161212 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.161577 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rrzqs"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.184724 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.186414 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-h5g2f"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.186995 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.191948 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.194343 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-h5g2f"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.194888 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.198162 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.198312 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.206255 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.206700 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.206931 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.208653 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60428194-e5a3-4578-872a-6d564e7b0a7c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9jwb2\" (UID: \"60428194-e5a3-4578-872a-6d564e7b0a7c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.208809 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-client-ca\") pod \"route-controller-manager-6576b87f9c-pv5bb\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.208900 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.246669 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.246808 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.247045 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.247456 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.248131 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.247692 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.247594 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.247725 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.248487 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.247739 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.248592 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-audit\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.247802 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.247844 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.247905 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.247945 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.247989 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.248015 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.248995 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4pmw2"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249365 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3241c5-45b5-464d-866b-690bcedf1934-trusted-ca-bundle\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249417 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249437 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwdzt\" (UniqueName: \"kubernetes.io/projected/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-kube-api-access-mwdzt\") pod \"route-controller-manager-6576b87f9c-pv5bb\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249454 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjpvm\" (UniqueName: \"kubernetes.io/projected/43219832-a329-442e-8bb7-200a4deff15d-kube-api-access-zjpvm\") pod \"control-plane-machine-set-operator-78cbb6b69f-94fn2\" (UID: \"43219832-a329-442e-8bb7-200a4deff15d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249471 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/60428194-e5a3-4578-872a-6d564e7b0a7c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9jwb2\" (UID: \"60428194-e5a3-4578-872a-6d564e7b0a7c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249487 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9a7e81a4-3454-4467-8355-9a19f0274073-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gl96d\" (UID: \"9a7e81a4-3454-4467-8355-9a19f0274073\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249502 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66d0db3d-0908-48d2-be73-e1afd790f010-etcd-client\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249518 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a7e81a4-3454-4467-8355-9a19f0274073-serving-cert\") pod \"openshift-config-operator-7777fb866f-gl96d\" (UID: \"9a7e81a4-3454-4467-8355-9a19f0274073\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249551 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249562 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/43219832-a329-442e-8bb7-200a4deff15d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-94fn2\" (UID: \"43219832-a329-442e-8bb7-200a4deff15d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249580 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66d0db3d-0908-48d2-be73-e1afd790f010-audit-dir\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249595 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-serving-cert\") pod \"route-controller-manager-6576b87f9c-pv5bb\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249620 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn78j\" (UniqueName: \"kubernetes.io/projected/cd116d5f-0452-44f0-ac17-2a0e2e083b93-kube-api-access-wn78j\") pod \"downloads-7954f5f757-b5l7b\" (UID: \"cd116d5f-0452-44f0-ac17-2a0e2e083b93\") " pod="openshift-console/downloads-7954f5f757-b5l7b"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249639 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sdch\" (UniqueName: \"kubernetes.io/projected/3423b879-0617-45e9-a197-11d3c67141b7-kube-api-access-8sdch\") pod \"cluster-samples-operator-665b6dd947-gdr9z\" (UID: \"3423b879-0617-45e9-a197-11d3c67141b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249655 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-image-import-ca\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249683 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66d0db3d-0908-48d2-be73-e1afd790f010-serving-cert\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249659 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249725 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3423b879-0617-45e9-a197-11d3c67141b7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gdr9z\" (UID: \"3423b879-0617-45e9-a197-11d3c67141b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249751 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c3241c5-45b5-464d-866b-690bcedf1934-oauth-serving-cert\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249768 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngqk7\" (UniqueName: \"kubernetes.io/projected/7c3241c5-45b5-464d-866b-690bcedf1934-kube-api-access-ngqk7\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249786 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60428194-e5a3-4578-872a-6d564e7b0a7c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9jwb2\" (UID: \"60428194-e5a3-4578-872a-6d564e7b0a7c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249801 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvkts\" (UniqueName: \"kubernetes.io/projected/60428194-e5a3-4578-872a-6d564e7b0a7c-kube-api-access-rvkts\") pod \"cluster-image-registry-operator-dc59b4c8b-9jwb2\" (UID: \"60428194-e5a3-4578-872a-6d564e7b0a7c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249823 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-etcd-serving-ca\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249838 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3241c5-45b5-464d-866b-690bcedf1934-service-ca\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249853 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sctqn\" (UniqueName: \"kubernetes.io/projected/9a7e81a4-3454-4467-8355-9a19f0274073-kube-api-access-sctqn\") pod \"openshift-config-operator-7777fb866f-gl96d\" (UID: \"9a7e81a4-3454-4467-8355-9a19f0274073\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249868 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-config\") pod \"route-controller-manager-6576b87f9c-pv5bb\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249888 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-config\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249903 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66d0db3d-0908-48d2-be73-e1afd790f010-encryption-config\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249919 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66d0db3d-0908-48d2-be73-e1afd790f010-node-pullsecrets\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249935 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c3241c5-45b5-464d-866b-690bcedf1934-console-config\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249961 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c3241c5-45b5-464d-866b-690bcedf1934-console-oauth-config\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249978 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwfkx\" (UniqueName: \"kubernetes.io/projected/66d0db3d-0908-48d2-be73-e1afd790f010-kube-api-access-bwfkx\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.249996 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3241c5-45b5-464d-866b-690bcedf1934-console-serving-cert\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.250135 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4pmw2"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.250269 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.250874 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.250980 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.251017 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.251155 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.251174 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.251274 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.251330 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.251375 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.252723 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6nml7"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.253388 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.254236 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c982t"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.255466 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-c982t"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.258705 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.263851 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.265070 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.264467 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.273425 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.273474 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.273712 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h4mp7"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.274436 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8hj5v"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.274568 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.276822 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.277007 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.277139 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.277159 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.277029 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.278591 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.288036 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.292144 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.295039 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.295240 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.297595 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.298193 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nkgtm"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.298214 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.298285 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.313581 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.314158 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.314846 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.315328 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.316779 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.319966 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.321990 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.323492 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gl96d"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.323552 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29534926-k666h"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.323992 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.324359 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534926-k666h"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.331910 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.332545 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.333569 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.338824 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-djlxf"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.338953 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5nm8c"]
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.378865 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-audit\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.378900 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3241c5-45b5-464d-866b-690bcedf1934-trusted-ca-bundle\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.378926 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.378944 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwdzt\" (UniqueName: \"kubernetes.io/projected/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-kube-api-access-mwdzt\") pod \"route-controller-manager-6576b87f9c-pv5bb\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.378965 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjpvm\" (UniqueName: \"kubernetes.io/projected/43219832-a329-442e-8bb7-200a4deff15d-kube-api-access-zjpvm\") pod \"control-plane-machine-set-operator-78cbb6b69f-94fn2\" (UID: \"43219832-a329-442e-8bb7-200a4deff15d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.378980 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/60428194-e5a3-4578-872a-6d564e7b0a7c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9jwb2\" (UID: \"60428194-e5a3-4578-872a-6d564e7b0a7c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.378997 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9a7e81a4-3454-4467-8355-9a19f0274073-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gl96d\" (UID: \"9a7e81a4-3454-4467-8355-9a19f0274073\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379013 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a7e81a4-3454-4467-8355-9a19f0274073-serving-cert\") pod \"openshift-config-operator-7777fb866f-gl96d\" (UID: \"9a7e81a4-3454-4467-8355-9a19f0274073\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379028 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66d0db3d-0908-48d2-be73-e1afd790f010-etcd-client\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379046 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/43219832-a329-442e-8bb7-200a4deff15d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-94fn2\" (UID: \"43219832-a329-442e-8bb7-200a4deff15d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379063 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66d0db3d-0908-48d2-be73-e1afd790f010-audit-dir\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm"
Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379078 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-serving-cert\") pod \"route-controller-manager-6576b87f9c-pv5bb\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379103 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn78j\" (UniqueName: \"kubernetes.io/projected/cd116d5f-0452-44f0-ac17-2a0e2e083b93-kube-api-access-wn78j\") pod \"downloads-7954f5f757-b5l7b\" (UID: \"cd116d5f-0452-44f0-ac17-2a0e2e083b93\") " pod="openshift-console/downloads-7954f5f757-b5l7b" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379122 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sdch\" (UniqueName: \"kubernetes.io/projected/3423b879-0617-45e9-a197-11d3c67141b7-kube-api-access-8sdch\") pod \"cluster-samples-operator-665b6dd947-gdr9z\" (UID: \"3423b879-0617-45e9-a197-11d3c67141b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379137 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-image-import-ca\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379153 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66d0db3d-0908-48d2-be73-e1afd790f010-serving-cert\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379183 4925 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3423b879-0617-45e9-a197-11d3c67141b7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gdr9z\" (UID: \"3423b879-0617-45e9-a197-11d3c67141b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379200 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c3241c5-45b5-464d-866b-690bcedf1934-oauth-serving-cert\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379214 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngqk7\" (UniqueName: \"kubernetes.io/projected/7c3241c5-45b5-464d-866b-690bcedf1934-kube-api-access-ngqk7\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379230 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60428194-e5a3-4578-872a-6d564e7b0a7c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9jwb2\" (UID: \"60428194-e5a3-4578-872a-6d564e7b0a7c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379254 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvkts\" (UniqueName: \"kubernetes.io/projected/60428194-e5a3-4578-872a-6d564e7b0a7c-kube-api-access-rvkts\") pod \"cluster-image-registry-operator-dc59b4c8b-9jwb2\" (UID: \"60428194-e5a3-4578-872a-6d564e7b0a7c\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379289 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-config\") pod \"route-controller-manager-6576b87f9c-pv5bb\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379312 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-etcd-serving-ca\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379329 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3241c5-45b5-464d-866b-690bcedf1934-service-ca\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379345 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sctqn\" (UniqueName: \"kubernetes.io/projected/9a7e81a4-3454-4467-8355-9a19f0274073-kube-api-access-sctqn\") pod \"openshift-config-operator-7777fb866f-gl96d\" (UID: \"9a7e81a4-3454-4467-8355-9a19f0274073\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379361 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-config\") pod 
\"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379377 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66d0db3d-0908-48d2-be73-e1afd790f010-encryption-config\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379392 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66d0db3d-0908-48d2-be73-e1afd790f010-node-pullsecrets\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379407 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c3241c5-45b5-464d-866b-690bcedf1934-console-config\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379431 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c3241c5-45b5-464d-866b-690bcedf1934-console-oauth-config\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379450 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwfkx\" (UniqueName: \"kubernetes.io/projected/66d0db3d-0908-48d2-be73-e1afd790f010-kube-api-access-bwfkx\") pod 
\"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379466 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3241c5-45b5-464d-866b-690bcedf1934-console-serving-cert\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379484 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60428194-e5a3-4578-872a-6d564e7b0a7c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9jwb2\" (UID: \"60428194-e5a3-4578-872a-6d564e7b0a7c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.379499 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-client-ca\") pod \"route-controller-manager-6576b87f9c-pv5bb\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.380276 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-client-ca\") pod \"route-controller-manager-6576b87f9c-pv5bb\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.380896 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-audit\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.382099 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c3241c5-45b5-464d-866b-690bcedf1934-oauth-serving-cert\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.382224 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66d0db3d-0908-48d2-be73-e1afd790f010-audit-dir\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.382768 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66d0db3d-0908-48d2-be73-e1afd790f010-node-pullsecrets\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.382894 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c3241c5-45b5-464d-866b-690bcedf1934-trusted-ca-bundle\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.384209 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pxx78"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.386384 
4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.386501 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-b5l7b"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.386601 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.386670 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.386135 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-config\") pod \"route-controller-manager-6576b87f9c-pv5bb\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.384584 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c3241c5-45b5-464d-866b-690bcedf1934-console-config\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.391439 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9a7e81a4-3454-4467-8355-9a19f0274073-available-featuregates\") pod \"openshift-config-operator-7777fb866f-gl96d\" (UID: \"9a7e81a4-3454-4467-8355-9a19f0274073\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.385058 4925 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.384744 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-config\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.387585 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bpqx7"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.391740 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-btcxz"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.389852 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-image-import-ca\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.385655 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.385737 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.392244 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kv2xl"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.392587 4925 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.393146 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.393650 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.394120 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c3241c5-45b5-464d-866b-690bcedf1934-service-ca\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.394266 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a7e81a4-3454-4467-8355-9a19f0274073-serving-cert\") pod \"openshift-config-operator-7777fb866f-gl96d\" (UID: \"9a7e81a4-3454-4467-8355-9a19f0274073\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.394351 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.395516 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r6b4b"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.396495 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r6b4b" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.396683 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hjqhw"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.397469 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hjqhw" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.397911 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3423b879-0617-45e9-a197-11d3c67141b7-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-gdr9z\" (UID: \"3423b879-0617-45e9-a197-11d3c67141b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.398278 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.400378 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rrzqs"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.401389 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.402090 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-t62dr"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.403306 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.404507 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-c982t"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.405452 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c3241c5-45b5-464d-866b-690bcedf1934-console-oauth-config\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.405648 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-serving-cert\") pod \"route-controller-manager-6576b87f9c-pv5bb\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.405768 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29534926-k666h"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.406077 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66d0db3d-0908-48d2-be73-e1afd790f010-encryption-config\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.406988 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.407790 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66d0db3d-0908-48d2-be73-e1afd790f010-etcd-serving-ca\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " 
pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.407813 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66d0db3d-0908-48d2-be73-e1afd790f010-etcd-client\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.408179 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r6b4b"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.408956 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66d0db3d-0908-48d2-be73-e1afd790f010-serving-cert\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.409680 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.410157 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c3241c5-45b5-464d-866b-690bcedf1934-console-serving-cert\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.411569 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.411801 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 
08:47:52.412625 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.415495 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.416880 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h4mp7"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.418603 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nsvcl"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.419983 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hjqhw"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.421690 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.423356 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.427339 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.428702 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.430006 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5nm8c"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.431336 4925 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-server-n2g7g"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.431544 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.432702 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n2g7g" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.432779 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6nml7"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.434135 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4pmw2"] Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.452034 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.471856 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.491501 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.512147 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.532178 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.551577 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 26 08:47:52 
crc kubenswrapper[4925]: I0226 08:47:52.571453 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.597210 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.607191 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60428194-e5a3-4578-872a-6d564e7b0a7c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9jwb2\" (UID: \"60428194-e5a3-4578-872a-6d564e7b0a7c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.611959 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.631245 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.639449 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/60428194-e5a3-4578-872a-6d564e7b0a7c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9jwb2\" (UID: \"60428194-e5a3-4578-872a-6d564e7b0a7c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.651508 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.654424 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/43219832-a329-442e-8bb7-200a4deff15d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-94fn2\" (UID: \"43219832-a329-442e-8bb7-200a4deff15d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.673126 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.692191 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.731636 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.752183 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.771521 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.792014 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.811931 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.832296 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.852895 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 26 08:47:52 crc 
kubenswrapper[4925]: I0226 08:47:52.873411 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.892398 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.911719 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.932366 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.952286 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.971871 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 26 08:47:52 crc kubenswrapper[4925]: I0226 08:47:52.992300 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.012110 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.032788 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.052355 4925 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.072298 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.093062 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.111731 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.132837 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.152972 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.171745 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.192825 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.212701 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.232850 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 26 08:47:53 crc kubenswrapper[4925]: 
I0226 08:47:53.250926 4925 request.go:700] Waited for 1.000129357s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.252411 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.272356 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.293784 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.312512 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.337376 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.351470 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.371934 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.393076 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.432733 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 26 
08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.451491 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.471414 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.492568 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.512796 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.532627 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.552426 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.572583 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.593189 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.612749 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.632702 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.652835 
4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.673434 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.692209 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.712489 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.733339 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.752600 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.772364 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.793244 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.795137 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:53 crc kubenswrapper[4925]: E0226 08:47:53.795276 4925 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:49:55.795249023 +0000 UTC m=+327.934823316 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.795375 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.795431 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.795523 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " 
pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.795611 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.795672 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.796871 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.799499 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc-metrics-certs\") pod \"network-metrics-daemon-96njb\" (UID: \"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc\") " pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.800609 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.800726 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.801414 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.813036 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.831899 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.852764 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.872125 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.892590 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 08:47:53 crc 
kubenswrapper[4925]: I0226 08:47:53.912462 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.950438 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngqk7\" (UniqueName: \"kubernetes.io/projected/7c3241c5-45b5-464d-866b-690bcedf1934-kube-api-access-ngqk7\") pod \"console-f9d7485db-kv2xl\" (UID: \"7c3241c5-45b5-464d-866b-690bcedf1934\") " pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.969943 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwfkx\" (UniqueName: \"kubernetes.io/projected/66d0db3d-0908-48d2-be73-e1afd790f010-kube-api-access-bwfkx\") pod \"apiserver-76f77b778f-nkgtm\" (UID: \"66d0db3d-0908-48d2-be73-e1afd790f010\") " pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:53 crc kubenswrapper[4925]: I0226 08:47:53.993887 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60428194-e5a3-4578-872a-6d564e7b0a7c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9jwb2\" (UID: \"60428194-e5a3-4578-872a-6d564e7b0a7c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.018818 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvkts\" (UniqueName: \"kubernetes.io/projected/60428194-e5a3-4578-872a-6d564e7b0a7c-kube-api-access-rvkts\") pod \"cluster-image-registry-operator-dc59b4c8b-9jwb2\" (UID: \"60428194-e5a3-4578-872a-6d564e7b0a7c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.026401 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.032888 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.033483 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn78j\" (UniqueName: \"kubernetes.io/projected/cd116d5f-0452-44f0-ac17-2a0e2e083b93-kube-api-access-wn78j\") pod \"downloads-7954f5f757-b5l7b\" (UID: \"cd116d5f-0452-44f0-ac17-2a0e2e083b93\") " pod="openshift-console/downloads-7954f5f757-b5l7b" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.041080 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.048570 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96njb" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.053934 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sdch\" (UniqueName: \"kubernetes.io/projected/3423b879-0617-45e9-a197-11d3c67141b7-kube-api-access-8sdch\") pod \"cluster-samples-operator-665b6dd947-gdr9z\" (UID: \"3423b879-0617-45e9-a197-11d3c67141b7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.082223 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwdzt\" (UniqueName: \"kubernetes.io/projected/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-kube-api-access-mwdzt\") pod \"route-controller-manager-6576b87f9c-pv5bb\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.096063 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjpvm\" (UniqueName: \"kubernetes.io/projected/43219832-a329-442e-8bb7-200a4deff15d-kube-api-access-zjpvm\") pod \"control-plane-machine-set-operator-78cbb6b69f-94fn2\" (UID: \"43219832-a329-442e-8bb7-200a4deff15d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.114595 4925 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.116191 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sctqn\" (UniqueName: \"kubernetes.io/projected/9a7e81a4-3454-4467-8355-9a19f0274073-kube-api-access-sctqn\") pod \"openshift-config-operator-7777fb866f-gl96d\" (UID: \"9a7e81a4-3454-4467-8355-9a19f0274073\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.116399 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.124707 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.135003 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.152663 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.171959 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.192559 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.218906 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.233340 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.236902 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.249854 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.258207 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.271336 4925 request.go:700] Waited for 1.873619423s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/configmaps?fieldSelector=metadata.name%3Ddns-default&limit=500&resourceVersion=0 Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.273832 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.273873 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-b5l7b" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.292225 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.294375 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.302446 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.312138 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.332405 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.350923 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.352460 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410049 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/705f399b-a347-4aa1-a502-7b9720e67d71-service-ca-bundle\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410102 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-serving-cert\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410134 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8b5c\" 
(UniqueName: \"kubernetes.io/projected/5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d-kube-api-access-s8b5c\") pod \"machine-approver-56656f9798-chlnd\" (UID: \"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410166 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqn2n\" (UniqueName: \"kubernetes.io/projected/37e33b34-0cc8-402b-b214-59e62bf03cfb-kube-api-access-jqn2n\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410195 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410222 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410251 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4zhr\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-kube-api-access-x4zhr\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: 
\"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410276 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fc3c69d-10be-4495-a469-8b975f26754f-service-ca-bundle\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410301 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68011ad7-d008-4442-9505-4ff72735198e-config\") pod \"kube-controller-manager-operator-78b949d7b-tldwj\" (UID: \"68011ad7-d008-4442-9505-4ff72735198e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410334 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd43ed89-2634-4c7a-a030-a468d0b1c481-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410364 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410389 4925 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-etcd-client\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410418 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6e131998-c782-4fda-bd5a-0fe0255c143d-images\") pod \"machine-api-operator-5694c8668f-pxx78\" (UID: \"6e131998-c782-4fda-bd5a-0fe0255c143d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410442 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqlbp\" (UniqueName: \"kubernetes.io/projected/fa394c7c-78af-4696-8fc3-73e8301effd6-kube-api-access-sqlbp\") pod \"dns-operator-744455d44c-nsvcl\" (UID: \"fa394c7c-78af-4696-8fc3-73e8301effd6\") " pod="openshift-dns-operator/dns-operator-744455d44c-nsvcl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410468 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-config\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410493 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd43ed89-2634-4c7a-a030-a468d0b1c481-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410518 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-bound-sa-token\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410564 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b59f8920-d5ae-4e79-9923-75219a4e00fa-config\") pod \"console-operator-58897d9998-t62dr\" (UID: \"b59f8920-d5ae-4e79-9923-75219a4e00fa\") " pod="openshift-console-operator/console-operator-58897d9998-t62dr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410589 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410618 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/996aef4b-8460-499a-82ff-eb4fc29e8ca5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6tqj\" (UID: \"996aef4b-8460-499a-82ff-eb4fc29e8ca5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410649 4925 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37e33b34-0cc8-402b-b214-59e62bf03cfb-serving-cert\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410678 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37e33b34-0cc8-402b-b214-59e62bf03cfb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410704 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d90b9c6c-9c34-4d18-bda8-0bf6e482e71b-trusted-ca\") pod \"ingress-operator-5b745b69d9-f5g2n\" (UID: \"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410732 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d-config\") pod \"machine-approver-56656f9798-chlnd\" (UID: \"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410758 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b59f8920-d5ae-4e79-9923-75219a4e00fa-trusted-ca\") pod \"console-operator-58897d9998-t62dr\" (UID: \"b59f8920-d5ae-4e79-9923-75219a4e00fa\") " 
pod="openshift-console-operator/console-operator-58897d9998-t62dr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410787 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b91e7c6-743f-4b13-a9e9-a59dd125eaaa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mzkdl\" (UID: \"2b91e7c6-743f-4b13-a9e9-a59dd125eaaa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410814 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6mjmn\" (UID: \"5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410838 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-etcd-service-ca\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410867 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rk9z\" (UniqueName: \"kubernetes.io/projected/d90b9c6c-9c34-4d18-bda8-0bf6e482e71b-kube-api-access-2rk9z\") pod \"ingress-operator-5b745b69d9-f5g2n\" (UID: \"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410890 4925 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-client-ca\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410915 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd43ed89-2634-4c7a-a030-a468d0b1c481-trusted-ca\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410943 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68011ad7-d008-4442-9505-4ff72735198e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tldwj\" (UID: \"68011ad7-d008-4442-9505-4ff72735198e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410970 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.410998 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-6mjmn\" (UID: \"5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411024 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmdgt\" (UniqueName: \"kubernetes.io/projected/2b91e7c6-743f-4b13-a9e9-a59dd125eaaa-kube-api-access-bmdgt\") pod \"openshift-apiserver-operator-796bbdcf4f-mzkdl\" (UID: \"2b91e7c6-743f-4b13-a9e9-a59dd125eaaa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411049 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d90b9c6c-9c34-4d18-bda8-0bf6e482e71b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f5g2n\" (UID: \"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411076 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d90b9c6c-9c34-4d18-bda8-0bf6e482e71b-metrics-tls\") pod \"ingress-operator-5b745b69d9-f5g2n\" (UID: \"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411099 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b59f8920-d5ae-4e79-9923-75219a4e00fa-serving-cert\") pod \"console-operator-58897d9998-t62dr\" (UID: \"b59f8920-d5ae-4e79-9923-75219a4e00fa\") " pod="openshift-console-operator/console-operator-58897d9998-t62dr" Feb 26 08:47:54 crc 
kubenswrapper[4925]: I0226 08:47:54.411122 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411150 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-registry-tls\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411173 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0fc3c69d-10be-4495-a469-8b975f26754f-default-certificate\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411197 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fc3c69d-10be-4495-a469-8b975f26754f-metrics-certs\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411222 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/dd43ed89-2634-4c7a-a030-a468d0b1c481-registry-certificates\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411247 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvv2h\" (UniqueName: \"kubernetes.io/projected/705f399b-a347-4aa1-a502-7b9720e67d71-kube-api-access-nvv2h\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411272 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-serving-cert\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411299 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/996aef4b-8460-499a-82ff-eb4fc29e8ca5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6tqj\" (UID: \"996aef4b-8460-499a-82ff-eb4fc29e8ca5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411326 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c637160e-e623-415a-b0e4-9277dee94439-audit-dir\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411355 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37e33b34-0cc8-402b-b214-59e62bf03cfb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411384 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xztd\" (UniqueName: \"kubernetes.io/projected/b59f8920-d5ae-4e79-9923-75219a4e00fa-kube-api-access-7xztd\") pod \"console-operator-58897d9998-t62dr\" (UID: \"b59f8920-d5ae-4e79-9923-75219a4e00fa\") " pod="openshift-console-operator/console-operator-58897d9998-t62dr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411412 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37e33b34-0cc8-402b-b214-59e62bf03cfb-encryption-config\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411441 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-config\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411468 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411497 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-audit-policies\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411519 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/705f399b-a347-4aa1-a502-7b9720e67d71-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411559 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d-auth-proxy-config\") pod \"machine-approver-56656f9798-chlnd\" (UID: \"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411582 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411606 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37e33b34-0cc8-402b-b214-59e62bf03cfb-audit-dir\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411633 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411655 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa394c7c-78af-4696-8fc3-73e8301effd6-metrics-tls\") pod \"dns-operator-744455d44c-nsvcl\" (UID: \"fa394c7c-78af-4696-8fc3-73e8301effd6\") " pod="openshift-dns-operator/dns-operator-744455d44c-nsvcl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411679 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxkn9\" (UniqueName: \"kubernetes.io/projected/996aef4b-8460-499a-82ff-eb4fc29e8ca5-kube-api-access-kxkn9\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6tqj\" (UID: \"996aef4b-8460-499a-82ff-eb4fc29e8ca5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411701 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/37e33b34-0cc8-402b-b214-59e62bf03cfb-audit-policies\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411723 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6mjmn\" (UID: \"5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411744 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/705f399b-a347-4aa1-a502-7b9720e67d71-serving-cert\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411782 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411810 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e131998-c782-4fda-bd5a-0fe0255c143d-config\") pod \"machine-api-operator-5694c8668f-pxx78\" (UID: \"6e131998-c782-4fda-bd5a-0fe0255c143d\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411839 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68011ad7-d008-4442-9505-4ff72735198e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tldwj\" (UID: \"68011ad7-d008-4442-9505-4ff72735198e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411864 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411888 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37e33b34-0cc8-402b-b214-59e62bf03cfb-etcd-client\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411909 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d-machine-approver-tls\") pod \"machine-approver-56656f9798-chlnd\" (UID: \"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411932 4925 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84kkq\" (UniqueName: \"kubernetes.io/projected/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-kube-api-access-84kkq\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411955 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b91e7c6-743f-4b13-a9e9-a59dd125eaaa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mzkdl\" (UID: \"2b91e7c6-743f-4b13-a9e9-a59dd125eaaa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.411976 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg8rw\" (UniqueName: \"kubernetes.io/projected/6e131998-c782-4fda-bd5a-0fe0255c143d-kube-api-access-sg8rw\") pod \"machine-api-operator-5694c8668f-pxx78\" (UID: \"6e131998-c782-4fda-bd5a-0fe0255c143d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.412000 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.412025 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0fc3c69d-10be-4495-a469-8b975f26754f-stats-auth\") pod 
\"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.412048 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctf6s\" (UniqueName: \"kubernetes.io/projected/0fc3c69d-10be-4495-a469-8b975f26754f-kube-api-access-ctf6s\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.412075 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-etcd-ca\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.412098 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e131998-c782-4fda-bd5a-0fe0255c143d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pxx78\" (UID: \"6e131998-c782-4fda-bd5a-0fe0255c143d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.412122 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.412149 4925 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57w97\" (UniqueName: \"kubernetes.io/projected/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-kube-api-access-57w97\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.412173 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gcl4\" (UniqueName: \"kubernetes.io/projected/c637160e-e623-415a-b0e4-9277dee94439-kube-api-access-4gcl4\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.412282 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/705f399b-a347-4aa1-a502-7b9720e67d71-config\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" Feb 26 08:47:54 crc kubenswrapper[4925]: E0226 08:47:54.414074 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:54.914056097 +0000 UTC m=+207.053630380 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.512680 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:54 crc kubenswrapper[4925]: E0226 08:47:54.512980 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.012934652 +0000 UTC m=+207.152508945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.513059 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.513128 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95ec09f6-783a-4661-9312-2d8f96a33825-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tv47r\" (UID: \"95ec09f6-783a-4661-9312-2d8f96a33825\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.513239 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3eefc5b-29c8-4016-83b9-f178e1de9a52-apiservice-cert\") pod \"packageserver-d55dfcdfc-6kdtm\" (UID: \"d3eefc5b-29c8-4016-83b9-f178e1de9a52\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.513281 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/68011ad7-d008-4442-9505-4ff72735198e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tldwj\" (UID: \"68011ad7-d008-4442-9505-4ff72735198e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.513307 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/26d0f330-c397-4f6a-8478-a706dc0c1af2-profile-collector-cert\") pod \"catalog-operator-68c6474976-k78cx\" (UID: \"26d0f330-c397-4f6a-8478-a706dc0c1af2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.513337 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37e33b34-0cc8-402b-b214-59e62bf03cfb-etcd-client\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.513367 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: E0226 08:47:54.513435 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.013413464 +0000 UTC m=+207.152987967 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.514953 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515209 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95ec09f6-783a-4661-9312-2d8f96a33825-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tv47r\" (UID: \"95ec09f6-783a-4661-9312-2d8f96a33825\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515242 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt6nm\" (UniqueName: \"kubernetes.io/projected/4d09752b-8e55-4212-b8b7-81adee9ffa6b-kube-api-access-dt6nm\") pod \"ingress-canary-r6b4b\" (UID: \"4d09752b-8e55-4212-b8b7-81adee9ffa6b\") " pod="openshift-ingress-canary/ingress-canary-r6b4b" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515266 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/26d0f330-c397-4f6a-8478-a706dc0c1af2-srv-cert\") pod \"catalog-operator-68c6474976-k78cx\" (UID: \"26d0f330-c397-4f6a-8478-a706dc0c1af2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515299 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg8rw\" (UniqueName: \"kubernetes.io/projected/6e131998-c782-4fda-bd5a-0fe0255c143d-kube-api-access-sg8rw\") pod \"machine-api-operator-5694c8668f-pxx78\" (UID: \"6e131998-c782-4fda-bd5a-0fe0255c143d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515322 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/24e97113-beed-4628-94f1-bff371e2fecd-images\") pod \"machine-config-operator-74547568cd-k2dxr\" (UID: \"24e97113-beed-4628-94f1-bff371e2fecd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515349 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0fc3c69d-10be-4495-a469-8b975f26754f-stats-auth\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515376 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-etcd-ca\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515402 4925 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e131998-c782-4fda-bd5a-0fe0255c143d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pxx78\" (UID: \"6e131998-c782-4fda-bd5a-0fe0255c143d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515423 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515449 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/705f399b-a347-4aa1-a502-7b9720e67d71-config\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515475 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62vcq\" (UniqueName: \"kubernetes.io/projected/3fa6dc4c-4a71-427a-9a01-567e0d4c0325-kube-api-access-62vcq\") pod \"olm-operator-6b444d44fb-dd52h\" (UID: \"3fa6dc4c-4a71-427a-9a01-567e0d4c0325\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515507 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0af8dc28-2edf-4a81-bcb9-a5d54ccf8174-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-fgfsf\" (UID: \"0af8dc28-2edf-4a81-bcb9-a5d54ccf8174\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515573 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqn2n\" (UniqueName: \"kubernetes.io/projected/37e33b34-0cc8-402b-b214-59e62bf03cfb-kube-api-access-jqn2n\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515604 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8b5c\" (UniqueName: \"kubernetes.io/projected/5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d-kube-api-access-s8b5c\") pod \"machine-approver-56656f9798-chlnd\" (UID: \"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515629 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515654 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515679 4925 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68011ad7-d008-4442-9505-4ff72735198e-config\") pod \"kube-controller-manager-operator-78b949d7b-tldwj\" (UID: \"68011ad7-d008-4442-9505-4ff72735198e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515702 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.516932 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-etcd-ca\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.515728 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-config\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.517709 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-etcd-client\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.517738 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6e131998-c782-4fda-bd5a-0fe0255c143d-images\") pod \"machine-api-operator-5694c8668f-pxx78\" (UID: \"6e131998-c782-4fda-bd5a-0fe0255c143d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.517766 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dabc4264-eb11-4a46-ad16-75e286b8f776-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6nml7\" (UID: \"dabc4264-eb11-4a46-ad16-75e286b8f776\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.517793 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gg9g\" (UniqueName: \"kubernetes.io/projected/0e938a2d-afc6-4360-aa50-e93c71d2c8c3-kube-api-access-8gg9g\") pod \"auto-csr-approver-29534926-k666h\" (UID: \"0e938a2d-afc6-4360-aa50-e93c71d2c8c3\") " pod="openshift-infra/auto-csr-approver-29534926-k666h" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.517822 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b59f8920-d5ae-4e79-9923-75219a4e00fa-config\") pod \"console-operator-58897d9998-t62dr\" (UID: \"b59f8920-d5ae-4e79-9923-75219a4e00fa\") " pod="openshift-console-operator/console-operator-58897d9998-t62dr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.517843 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c8bdc47f-6d95-402a-95a7-2f61af7a65ee-config\") pod \"service-ca-operator-777779d784-4w7fl\" (UID: \"c8bdc47f-6d95-402a-95a7-2f61af7a65ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.517869 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/996aef4b-8460-499a-82ff-eb4fc29e8ca5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6tqj\" (UID: \"996aef4b-8460-499a-82ff-eb4fc29e8ca5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.517898 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3fa6dc4c-4a71-427a-9a01-567e0d4c0325-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dd52h\" (UID: \"3fa6dc4c-4a71-427a-9a01-567e0d4c0325\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.517922 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b654495a-8604-48ee-9a86-32c178849124-node-bootstrap-token\") pod \"machine-config-server-n2g7g\" (UID: \"b654495a-8604-48ee-9a86-32c178849124\") " pod="openshift-machine-config-operator/machine-config-server-n2g7g" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.517965 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37e33b34-0cc8-402b-b214-59e62bf03cfb-serving-cert\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 
08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.517992 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37e33b34-0cc8-402b-b214-59e62bf03cfb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518018 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b654495a-8604-48ee-9a86-32c178849124-certs\") pod \"machine-config-server-n2g7g\" (UID: \"b654495a-8604-48ee-9a86-32c178849124\") " pod="openshift-machine-config-operator/machine-config-server-n2g7g" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518041 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpb57\" (UniqueName: \"kubernetes.io/projected/e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc-kube-api-access-tpb57\") pod \"machine-config-controller-84d6567774-8cn5f\" (UID: \"e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518081 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b59f8920-d5ae-4e79-9923-75219a4e00fa-trusted-ca\") pod \"console-operator-58897d9998-t62dr\" (UID: \"b59f8920-d5ae-4e79-9923-75219a4e00fa\") " pod="openshift-console-operator/console-operator-58897d9998-t62dr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518105 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-registration-dir\") pod 
\"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518152 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-client-ca\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518179 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-plugins-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518204 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd43ed89-2634-4c7a-a030-a468d0b1c481-trusted-ca\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518250 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6mjmn\" (UID: \"5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518274 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/68011ad7-d008-4442-9505-4ff72735198e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tldwj\" (UID: \"68011ad7-d008-4442-9505-4ff72735198e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518298 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518325 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmdgt\" (UniqueName: \"kubernetes.io/projected/2b91e7c6-743f-4b13-a9e9-a59dd125eaaa-kube-api-access-bmdgt\") pod \"openshift-apiserver-operator-796bbdcf4f-mzkdl\" (UID: \"2b91e7c6-743f-4b13-a9e9-a59dd125eaaa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518352 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d90b9c6c-9c34-4d18-bda8-0bf6e482e71b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f5g2n\" (UID: \"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518378 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3fa6dc4c-4a71-427a-9a01-567e0d4c0325-srv-cert\") pod \"olm-operator-6b444d44fb-dd52h\" (UID: \"3fa6dc4c-4a71-427a-9a01-567e0d4c0325\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518408 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0fc3c69d-10be-4495-a469-8b975f26754f-default-certificate\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518430 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fc3c69d-10be-4495-a469-8b975f26754f-metrics-certs\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518456 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d90b9c6c-9c34-4d18-bda8-0bf6e482e71b-metrics-tls\") pod \"ingress-operator-5b745b69d9-f5g2n\" (UID: \"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518485 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rvvr\" (UniqueName: \"kubernetes.io/projected/c8bdc47f-6d95-402a-95a7-2f61af7a65ee-kube-api-access-4rvvr\") pod \"service-ca-operator-777779d784-4w7fl\" (UID: \"c8bdc47f-6d95-402a-95a7-2f61af7a65ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518469 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518727 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvv2h\" (UniqueName: \"kubernetes.io/projected/705f399b-a347-4aa1-a502-7b9720e67d71-kube-api-access-nvv2h\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518755 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-serving-cert\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518794 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/996aef4b-8460-499a-82ff-eb4fc29e8ca5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6tqj\" (UID: \"996aef4b-8460-499a-82ff-eb4fc29e8ca5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518822 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwfzh\" (UniqueName: \"kubernetes.io/projected/d3eefc5b-29c8-4016-83b9-f178e1de9a52-kube-api-access-fwfzh\") pod \"packageserver-d55dfcdfc-6kdtm\" (UID: \"d3eefc5b-29c8-4016-83b9-f178e1de9a52\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518896 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-config\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518920 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3eefc5b-29c8-4016-83b9-f178e1de9a52-webhook-cert\") pod \"packageserver-d55dfcdfc-6kdtm\" (UID: \"d3eefc5b-29c8-4016-83b9-f178e1de9a52\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521366 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-audit-policies\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521420 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37e33b34-0cc8-402b-b214-59e62bf03cfb-audit-dir\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521485 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521518 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f1854f8f-c17d-479c-b0f4-46159b9d0da8-signing-cabundle\") pod \"service-ca-9c57cc56f-h4mp7\" (UID: \"f1854f8f-c17d-479c-b0f4-46159b9d0da8\") " pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521597 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g697\" (UniqueName: \"kubernetes.io/projected/b654495a-8604-48ee-9a86-32c178849124-kube-api-access-7g697\") pod \"machine-config-server-n2g7g\" (UID: \"b654495a-8604-48ee-9a86-32c178849124\") " pod="openshift-machine-config-operator/machine-config-server-n2g7g" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521627 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa394c7c-78af-4696-8fc3-73e8301effd6-metrics-tls\") pod \"dns-operator-744455d44c-nsvcl\" (UID: \"fa394c7c-78af-4696-8fc3-73e8301effd6\") " pod="openshift-dns-operator/dns-operator-744455d44c-nsvcl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521653 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b412aae4-dae9-42ff-96e6-8e982f62eb11-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-48gkw\" (UID: \"b412aae4-dae9-42ff-96e6-8e982f62eb11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521677 4925 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d09752b-8e55-4212-b8b7-81adee9ffa6b-cert\") pod \"ingress-canary-r6b4b\" (UID: \"4d09752b-8e55-4212-b8b7-81adee9ffa6b\") " pod="openshift-ingress-canary/ingress-canary-r6b4b"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521820 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/705f399b-a347-4aa1-a502-7b9720e67d71-serving-cert\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521846 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67pdr\" (UniqueName: \"kubernetes.io/projected/dabc4264-eb11-4a46-ad16-75e286b8f776-kube-api-access-67pdr\") pod \"marketplace-operator-79b997595-6nml7\" (UID: \"dabc4264-eb11-4a46-ad16-75e286b8f776\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nml7"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521874 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-csi-data-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521924 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e131998-c782-4fda-bd5a-0fe0255c143d-config\") pod \"machine-api-operator-5694c8668f-pxx78\" (UID: \"6e131998-c782-4fda-bd5a-0fe0255c143d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521957 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d-machine-approver-tls\") pod \"machine-approver-56656f9798-chlnd\" (UID: \"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521989 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ec09f6-783a-4661-9312-2d8f96a33825-config\") pod \"kube-apiserver-operator-766d6c64bb-tv47r\" (UID: \"95ec09f6-783a-4661-9312-2d8f96a33825\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522014 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f1854f8f-c17d-479c-b0f4-46159b9d0da8-signing-key\") pod \"service-ca-9c57cc56f-h4mp7\" (UID: \"f1854f8f-c17d-479c-b0f4-46159b9d0da8\") " pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522052 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74ac0fc1-580a-4095-a2aa-c4a5c1796ae0-metrics-tls\") pod \"dns-default-hjqhw\" (UID: \"74ac0fc1-580a-4095-a2aa-c4a5c1796ae0\") " pod="openshift-dns/dns-default-hjqhw"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522082 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd490e7f-4d64-4fc3-9c8d-17236968eafd-config-volume\") pod \"collect-profiles-29534925-7hrlx\" (UID: \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522110 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84kkq\" (UniqueName: \"kubernetes.io/projected/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-kube-api-access-84kkq\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522136 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522163 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b91e7c6-743f-4b13-a9e9-a59dd125eaaa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mzkdl\" (UID: \"2b91e7c6-743f-4b13-a9e9-a59dd125eaaa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522194 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42krz\" (UniqueName: \"kubernetes.io/projected/0af8dc28-2edf-4a81-bcb9-a5d54ccf8174-kube-api-access-42krz\") pod \"package-server-manager-789f6589d5-fgfsf\" (UID: \"0af8dc28-2edf-4a81-bcb9-a5d54ccf8174\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522224 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc-proxy-tls\") pod \"machine-config-controller-84d6567774-8cn5f\" (UID: \"e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522331 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctf6s\" (UniqueName: \"kubernetes.io/projected/0fc3c69d-10be-4495-a469-8b975f26754f-kube-api-access-ctf6s\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522355 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd43ed89-2634-4c7a-a030-a468d0b1c481-trusted-ca\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.519352 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522363 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dabc4264-eb11-4a46-ad16-75e286b8f776-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6nml7\" (UID: \"dabc4264-eb11-4a46-ad16-75e286b8f776\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nml7"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522450 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvxvq\" (UniqueName: \"kubernetes.io/projected/4bae7398-7181-4ce3-bc77-e1bafe05fe52-kube-api-access-rvxvq\") pod \"multus-admission-controller-857f4d67dd-c982t\" (UID: \"4bae7398-7181-4ce3-bc77-e1bafe05fe52\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c982t"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522493 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xr54\" (UniqueName: \"kubernetes.io/projected/b412aae4-dae9-42ff-96e6-8e982f62eb11-kube-api-access-7xr54\") pod \"kube-storage-version-migrator-operator-b67b599dd-48gkw\" (UID: \"b412aae4-dae9-42ff-96e6-8e982f62eb11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.522517 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znrhn\" (UniqueName: \"kubernetes.io/projected/f1854f8f-c17d-479c-b0f4-46159b9d0da8-kube-api-access-znrhn\") pod \"service-ca-9c57cc56f-h4mp7\" (UID: \"f1854f8f-c17d-479c-b0f4-46159b9d0da8\") " pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.523136 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-audit-policies\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.519959 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68011ad7-d008-4442-9505-4ff72735198e-config\") pod \"kube-controller-manager-operator-78b949d7b-tldwj\" (UID: \"68011ad7-d008-4442-9505-4ff72735198e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.523197 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37e33b34-0cc8-402b-b214-59e62bf03cfb-audit-dir\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.520174 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.523666 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-config\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.518320 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/705f399b-a347-4aa1-a502-7b9720e67d71-config\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.520270 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b59f8920-d5ae-4e79-9923-75219a4e00fa-trusted-ca\") pod \"console-operator-58897d9998-t62dr\" (UID: \"b59f8920-d5ae-4e79-9923-75219a4e00fa\") " pod="openshift-console-operator/console-operator-58897d9998-t62dr"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.520363 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-client-ca\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.521034 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/996aef4b-8460-499a-82ff-eb4fc29e8ca5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6tqj\" (UID: \"996aef4b-8460-499a-82ff-eb4fc29e8ca5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.523900 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b59f8920-d5ae-4e79-9923-75219a4e00fa-config\") pod \"console-operator-58897d9998-t62dr\" (UID: \"b59f8920-d5ae-4e79-9923-75219a4e00fa\") " pod="openshift-console-operator/console-operator-58897d9998-t62dr"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.524490 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-config\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.524644 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57w97\" (UniqueName: \"kubernetes.io/projected/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-kube-api-access-57w97\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.524689 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gcl4\" (UniqueName: \"kubernetes.io/projected/c637160e-e623-415a-b0e4-9277dee94439-kube-api-access-4gcl4\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.524723 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8cn5f\" (UID: \"e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.524759 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/705f399b-a347-4aa1-a502-7b9720e67d71-service-ca-bundle\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.524795 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-serving-cert\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.524825 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd43ed89-2634-4c7a-a030-a468d0b1c481-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.524854 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4zhr\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-kube-api-access-x4zhr\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.524914 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fc3c69d-10be-4495-a469-8b975f26754f-service-ca-bundle\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.524943 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8bdc47f-6d95-402a-95a7-2f61af7a65ee-serving-cert\") pod \"service-ca-operator-777779d784-4w7fl\" (UID: \"c8bdc47f-6d95-402a-95a7-2f61af7a65ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.524976 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqlbp\" (UniqueName: \"kubernetes.io/projected/fa394c7c-78af-4696-8fc3-73e8301effd6-kube-api-access-sqlbp\") pod \"dns-operator-744455d44c-nsvcl\" (UID: \"fa394c7c-78af-4696-8fc3-73e8301effd6\") " pod="openshift-dns-operator/dns-operator-744455d44c-nsvcl"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.525026 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd43ed89-2634-4c7a-a030-a468d0b1c481-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.525050 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-bound-sa-token\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.525075 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.525138 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd490e7f-4d64-4fc3-9c8d-17236968eafd-secret-volume\") pod \"collect-profiles-29534925-7hrlx\" (UID: \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.529890 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37e33b34-0cc8-402b-b214-59e62bf03cfb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.530865 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/705f399b-a347-4aa1-a502-7b9720e67d71-service-ca-bundle\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.531094 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fc3c69d-10be-4495-a469-8b975f26754f-service-ca-bundle\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.531839 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd43ed89-2634-4c7a-a030-a468d0b1c481-ca-trust-extracted\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.532696 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d90b9c6c-9c34-4d18-bda8-0bf6e482e71b-trusted-ca\") pod \"ingress-operator-5b745b69d9-f5g2n\" (UID: \"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.532855 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d-config\") pod \"machine-approver-56656f9798-chlnd\" (UID: \"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.532943 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e131998-c782-4fda-bd5a-0fe0255c143d-config\") pod \"machine-api-operator-5694c8668f-pxx78\" (UID: \"6e131998-c782-4fda-bd5a-0fe0255c143d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.533682 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d-config\") pod \"machine-approver-56656f9798-chlnd\" (UID: \"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534247 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d90b9c6c-9c34-4d18-bda8-0bf6e482e71b-trusted-ca\") pod \"ingress-operator-5b745b69d9-f5g2n\" (UID: \"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.533113 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d3eefc5b-29c8-4016-83b9-f178e1de9a52-tmpfs\") pod \"packageserver-d55dfcdfc-6kdtm\" (UID: \"d3eefc5b-29c8-4016-83b9-f178e1de9a52\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534431 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rk9z\" (UniqueName: \"kubernetes.io/projected/d90b9c6c-9c34-4d18-bda8-0bf6e482e71b-kube-api-access-2rk9z\") pod \"ingress-operator-5b745b69d9-f5g2n\" (UID: \"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534470 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b91e7c6-743f-4b13-a9e9-a59dd125eaaa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mzkdl\" (UID: \"2b91e7c6-743f-4b13-a9e9-a59dd125eaaa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534491 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6mjmn\" (UID: \"5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534511 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-etcd-service-ca\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534683 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-mountpoint-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534719 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6n9c\" (UniqueName: \"kubernetes.io/projected/dd490e7f-4d64-4fc3-9c8d-17236968eafd-kube-api-access-s6n9c\") pod \"collect-profiles-29534925-7hrlx\" (UID: \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534751 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzw6l\" (UniqueName: \"kubernetes.io/projected/9cd6715c-c655-46b0-8dd1-b8bfcb253db4-kube-api-access-lzw6l\") pod \"migrator-59844c95c7-4pmw2\" (UID: \"9cd6715c-c655-46b0-8dd1-b8bfcb253db4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4pmw2"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534780 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b412aae4-dae9-42ff-96e6-8e982f62eb11-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-48gkw\" (UID: \"b412aae4-dae9-42ff-96e6-8e982f62eb11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534805 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-registry-tls\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534841 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b59f8920-d5ae-4e79-9923-75219a4e00fa-serving-cert\") pod \"console-operator-58897d9998-t62dr\" (UID: \"b59f8920-d5ae-4e79-9923-75219a4e00fa\") " pod="openshift-console-operator/console-operator-58897d9998-t62dr"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534866 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534909 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd43ed89-2634-4c7a-a030-a468d0b1c481-registry-certificates\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534928 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-socket-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534947 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24e97113-beed-4628-94f1-bff371e2fecd-proxy-tls\") pod \"machine-config-operator-74547568cd-k2dxr\" (UID: \"24e97113-beed-4628-94f1-bff371e2fecd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534972 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c637160e-e623-415a-b0e4-9277dee94439-audit-dir\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.534991 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37e33b34-0cc8-402b-b214-59e62bf03cfb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535014 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xztd\" (UniqueName: \"kubernetes.io/projected/b59f8920-d5ae-4e79-9923-75219a4e00fa-kube-api-access-7xztd\") pod \"console-operator-58897d9998-t62dr\" (UID: \"b59f8920-d5ae-4e79-9923-75219a4e00fa\") " pod="openshift-console-operator/console-operator-58897d9998-t62dr"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535036 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfc9b\" (UniqueName: \"kubernetes.io/projected/24e97113-beed-4628-94f1-bff371e2fecd-kube-api-access-sfc9b\") pod \"machine-config-operator-74547568cd-k2dxr\" (UID: \"24e97113-beed-4628-94f1-bff371e2fecd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535075 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37e33b34-0cc8-402b-b214-59e62bf03cfb-encryption-config\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535094 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535112 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4bae7398-7181-4ce3-bc77-e1bafe05fe52-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c982t\" (UID: \"4bae7398-7181-4ce3-bc77-e1bafe05fe52\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c982t"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535136 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24e97113-beed-4628-94f1-bff371e2fecd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-k2dxr\" (UID: \"24e97113-beed-4628-94f1-bff371e2fecd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535159 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/705f399b-a347-4aa1-a502-7b9720e67d71-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535176 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d-auth-proxy-config\") pod \"machine-approver-56656f9798-chlnd\" (UID: \"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535195 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535216 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74ac0fc1-580a-4095-a2aa-c4a5c1796ae0-config-volume\") pod \"dns-default-hjqhw\" (UID: \"74ac0fc1-580a-4095-a2aa-c4a5c1796ae0\") " pod="openshift-dns/dns-default-hjqhw"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535233 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n29x\" (UniqueName: \"kubernetes.io/projected/74ac0fc1-580a-4095-a2aa-c4a5c1796ae0-kube-api-access-2n29x\") pod \"dns-default-hjqhw\" (UID: \"74ac0fc1-580a-4095-a2aa-c4a5c1796ae0\") " pod="openshift-dns/dns-default-hjqhw"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535267 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxkn9\" (UniqueName: \"kubernetes.io/projected/996aef4b-8460-499a-82ff-eb4fc29e8ca5-kube-api-access-kxkn9\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6tqj\" (UID: \"996aef4b-8460-499a-82ff-eb4fc29e8ca5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535288 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/37e33b34-0cc8-402b-b214-59e62bf03cfb-audit-policies\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535308 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6mjmn\" (UID: \"5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535327 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbhq6\" (UniqueName: \"kubernetes.io/projected/cd55dbfe-4f78-44d1-85a2-7871197c0874-kube-api-access-tbhq6\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535346 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-etcd-service-ca\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535384 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mshh5\" (UniqueName: \"kubernetes.io/projected/26d0f330-c397-4f6a-8478-a706dc0c1af2-kube-api-access-mshh5\") pod \"catalog-operator-68c6474976-k78cx\" (UID: \"26d0f330-c397-4f6a-8478-a706dc0c1af2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535629 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c637160e-e623-415a-b0e4-9277dee94439-audit-dir\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.535769 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68011ad7-d008-4442-9505-4ff72735198e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tldwj\" (UID: \"68011ad7-d008-4442-9505-4ff72735198e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.536291 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37e33b34-0cc8-402b-b214-59e62bf03cfb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv"
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.536963 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName:
\"kubernetes.io/configmap/5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d-auth-proxy-config\") pod \"machine-approver-56656f9798-chlnd\" (UID: \"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.537762 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b91e7c6-743f-4b13-a9e9-a59dd125eaaa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mzkdl\" (UID: \"2b91e7c6-743f-4b13-a9e9-a59dd125eaaa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.540330 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6e131998-c782-4fda-bd5a-0fe0255c143d-images\") pod \"machine-api-operator-5694c8668f-pxx78\" (UID: \"6e131998-c782-4fda-bd5a-0fe0255c143d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.540428 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6mjmn\" (UID: \"5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.541115 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/37e33b34-0cc8-402b-b214-59e62bf03cfb-audit-policies\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.541746 4925 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6mjmn\" (UID: \"5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.541754 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/705f399b-a347-4aa1-a502-7b9720e67d71-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.544388 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d-machine-approver-tls\") pod \"machine-approver-56656f9798-chlnd\" (UID: \"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.544481 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-etcd-client\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.544554 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.544702 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.545914 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-serving-cert\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.546172 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.546354 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa394c7c-78af-4696-8fc3-73e8301effd6-metrics-tls\") pod \"dns-operator-744455d44c-nsvcl\" (UID: \"fa394c7c-78af-4696-8fc3-73e8301effd6\") " pod="openshift-dns-operator/dns-operator-744455d44c-nsvcl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.546729 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-serving-cert\") pod 
\"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.547079 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b91e7c6-743f-4b13-a9e9-a59dd125eaaa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mzkdl\" (UID: \"2b91e7c6-743f-4b13-a9e9-a59dd125eaaa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.547906 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37e33b34-0cc8-402b-b214-59e62bf03cfb-etcd-client\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.548409 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd43ed89-2634-4c7a-a030-a468d0b1c481-installation-pull-secrets\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.548484 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6e131998-c782-4fda-bd5a-0fe0255c143d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pxx78\" (UID: \"6e131998-c782-4fda-bd5a-0fe0255c143d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.548630 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/37e33b34-0cc8-402b-b214-59e62bf03cfb-serving-cert\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.549175 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd43ed89-2634-4c7a-a030-a468d0b1c481-registry-certificates\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.549347 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/996aef4b-8460-499a-82ff-eb4fc29e8ca5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6tqj\" (UID: \"996aef4b-8460-499a-82ff-eb4fc29e8ca5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.549404 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d90b9c6c-9c34-4d18-bda8-0bf6e482e71b-metrics-tls\") pod \"ingress-operator-5b745b69d9-f5g2n\" (UID: \"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.549715 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0fc3c69d-10be-4495-a469-8b975f26754f-stats-auth\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.549788 4925 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.550200 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fc3c69d-10be-4495-a469-8b975f26754f-metrics-certs\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.551199 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0fc3c69d-10be-4495-a469-8b975f26754f-default-certificate\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.551257 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b59f8920-d5ae-4e79-9923-75219a4e00fa-serving-cert\") pod \"console-operator-58897d9998-t62dr\" (UID: \"b59f8920-d5ae-4e79-9923-75219a4e00fa\") " pod="openshift-console-operator/console-operator-58897d9998-t62dr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.551571 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/705f399b-a347-4aa1-a502-7b9720e67d71-serving-cert\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 
08:47:54.554119 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.554210 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg8rw\" (UniqueName: \"kubernetes.io/projected/6e131998-c782-4fda-bd5a-0fe0255c143d-kube-api-access-sg8rw\") pod \"machine-api-operator-5694c8668f-pxx78\" (UID: \"6e131998-c782-4fda-bd5a-0fe0255c143d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.555352 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.555607 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-registry-tls\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.556011 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: 
\"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.558781 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37e33b34-0cc8-402b-b214-59e62bf03cfb-encryption-config\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.565194 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.568960 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqn2n\" (UniqueName: \"kubernetes.io/projected/37e33b34-0cc8-402b-b214-59e62bf03cfb-kube-api-access-jqn2n\") pod \"apiserver-7bbb656c7d-lptdv\" (UID: \"37e33b34-0cc8-402b-b214-59e62bf03cfb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.589547 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68011ad7-d008-4442-9505-4ff72735198e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tldwj\" (UID: \"68011ad7-d008-4442-9505-4ff72735198e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.609710 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8b5c\" (UniqueName: 
\"kubernetes.io/projected/5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d-kube-api-access-s8b5c\") pod \"machine-approver-56656f9798-chlnd\" (UID: \"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.631064 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.633655 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmdgt\" (UniqueName: \"kubernetes.io/projected/2b91e7c6-743f-4b13-a9e9-a59dd125eaaa-kube-api-access-bmdgt\") pod \"openshift-apiserver-operator-796bbdcf4f-mzkdl\" (UID: \"2b91e7c6-743f-4b13-a9e9-a59dd125eaaa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.636610 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.636845 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rvvr\" (UniqueName: \"kubernetes.io/projected/c8bdc47f-6d95-402a-95a7-2f61af7a65ee-kube-api-access-4rvvr\") pod \"service-ca-operator-777779d784-4w7fl\" (UID: \"c8bdc47f-6d95-402a-95a7-2f61af7a65ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.636888 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwfzh\" (UniqueName: 
\"kubernetes.io/projected/d3eefc5b-29c8-4016-83b9-f178e1de9a52-kube-api-access-fwfzh\") pod \"packageserver-d55dfcdfc-6kdtm\" (UID: \"d3eefc5b-29c8-4016-83b9-f178e1de9a52\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.636915 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3eefc5b-29c8-4016-83b9-f178e1de9a52-webhook-cert\") pod \"packageserver-d55dfcdfc-6kdtm\" (UID: \"d3eefc5b-29c8-4016-83b9-f178e1de9a52\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.636938 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f1854f8f-c17d-479c-b0f4-46159b9d0da8-signing-cabundle\") pod \"service-ca-9c57cc56f-h4mp7\" (UID: \"f1854f8f-c17d-479c-b0f4-46159b9d0da8\") " pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.636962 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g697\" (UniqueName: \"kubernetes.io/projected/b654495a-8604-48ee-9a86-32c178849124-kube-api-access-7g697\") pod \"machine-config-server-n2g7g\" (UID: \"b654495a-8604-48ee-9a86-32c178849124\") " pod="openshift-machine-config-operator/machine-config-server-n2g7g" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.636986 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d09752b-8e55-4212-b8b7-81adee9ffa6b-cert\") pod \"ingress-canary-r6b4b\" (UID: \"4d09752b-8e55-4212-b8b7-81adee9ffa6b\") " pod="openshift-ingress-canary/ingress-canary-r6b4b" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637010 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/b412aae4-dae9-42ff-96e6-8e982f62eb11-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-48gkw\" (UID: \"b412aae4-dae9-42ff-96e6-8e982f62eb11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637034 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67pdr\" (UniqueName: \"kubernetes.io/projected/dabc4264-eb11-4a46-ad16-75e286b8f776-kube-api-access-67pdr\") pod \"marketplace-operator-79b997595-6nml7\" (UID: \"dabc4264-eb11-4a46-ad16-75e286b8f776\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637055 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-csi-data-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637080 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ec09f6-783a-4661-9312-2d8f96a33825-config\") pod \"kube-apiserver-operator-766d6c64bb-tv47r\" (UID: \"95ec09f6-783a-4661-9312-2d8f96a33825\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637103 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f1854f8f-c17d-479c-b0f4-46159b9d0da8-signing-key\") pod \"service-ca-9c57cc56f-h4mp7\" (UID: \"f1854f8f-c17d-479c-b0f4-46159b9d0da8\") " pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7" Feb 26 08:47:54 crc kubenswrapper[4925]: 
I0226 08:47:54.637130 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74ac0fc1-580a-4095-a2aa-c4a5c1796ae0-metrics-tls\") pod \"dns-default-hjqhw\" (UID: \"74ac0fc1-580a-4095-a2aa-c4a5c1796ae0\") " pod="openshift-dns/dns-default-hjqhw" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637151 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd490e7f-4d64-4fc3-9c8d-17236968eafd-config-volume\") pod \"collect-profiles-29534925-7hrlx\" (UID: \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637180 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42krz\" (UniqueName: \"kubernetes.io/projected/0af8dc28-2edf-4a81-bcb9-a5d54ccf8174-kube-api-access-42krz\") pod \"package-server-manager-789f6589d5-fgfsf\" (UID: \"0af8dc28-2edf-4a81-bcb9-a5d54ccf8174\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637204 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc-proxy-tls\") pod \"machine-config-controller-84d6567774-8cn5f\" (UID: \"e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637258 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dabc4264-eb11-4a46-ad16-75e286b8f776-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6nml7\" (UID: \"dabc4264-eb11-4a46-ad16-75e286b8f776\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637281 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvxvq\" (UniqueName: \"kubernetes.io/projected/4bae7398-7181-4ce3-bc77-e1bafe05fe52-kube-api-access-rvxvq\") pod \"multus-admission-controller-857f4d67dd-c982t\" (UID: \"4bae7398-7181-4ce3-bc77-e1bafe05fe52\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c982t" Feb 26 08:47:54 crc kubenswrapper[4925]: E0226 08:47:54.637362 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.13732349 +0000 UTC m=+207.276897813 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637453 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znrhn\" (UniqueName: \"kubernetes.io/projected/f1854f8f-c17d-479c-b0f4-46159b9d0da8-kube-api-access-znrhn\") pod \"service-ca-9c57cc56f-h4mp7\" (UID: \"f1854f8f-c17d-479c-b0f4-46159b9d0da8\") " pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637477 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xr54\" (UniqueName: 
\"kubernetes.io/projected/b412aae4-dae9-42ff-96e6-8e982f62eb11-kube-api-access-7xr54\") pod \"kube-storage-version-migrator-operator-b67b599dd-48gkw\" (UID: \"b412aae4-dae9-42ff-96e6-8e982f62eb11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637510 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8cn5f\" (UID: \"e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637552 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8bdc47f-6d95-402a-95a7-2f61af7a65ee-serving-cert\") pod \"service-ca-operator-777779d784-4w7fl\" (UID: \"c8bdc47f-6d95-402a-95a7-2f61af7a65ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637588 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd490e7f-4d64-4fc3-9c8d-17236968eafd-secret-volume\") pod \"collect-profiles-29534925-7hrlx\" (UID: \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637606 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d3eefc5b-29c8-4016-83b9-f178e1de9a52-tmpfs\") pod \"packageserver-d55dfcdfc-6kdtm\" (UID: \"d3eefc5b-29c8-4016-83b9-f178e1de9a52\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637624 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-mountpoint-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637653 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6n9c\" (UniqueName: \"kubernetes.io/projected/dd490e7f-4d64-4fc3-9c8d-17236968eafd-kube-api-access-s6n9c\") pod \"collect-profiles-29534925-7hrlx\" (UID: \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637673 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzw6l\" (UniqueName: \"kubernetes.io/projected/9cd6715c-c655-46b0-8dd1-b8bfcb253db4-kube-api-access-lzw6l\") pod \"migrator-59844c95c7-4pmw2\" (UID: \"9cd6715c-c655-46b0-8dd1-b8bfcb253db4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4pmw2" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637694 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b412aae4-dae9-42ff-96e6-8e982f62eb11-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-48gkw\" (UID: \"b412aae4-dae9-42ff-96e6-8e982f62eb11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637716 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-socket-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637732 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24e97113-beed-4628-94f1-bff371e2fecd-proxy-tls\") pod \"machine-config-operator-74547568cd-k2dxr\" (UID: \"24e97113-beed-4628-94f1-bff371e2fecd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637756 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfc9b\" (UniqueName: \"kubernetes.io/projected/24e97113-beed-4628-94f1-bff371e2fecd-kube-api-access-sfc9b\") pod \"machine-config-operator-74547568cd-k2dxr\" (UID: \"24e97113-beed-4628-94f1-bff371e2fecd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637774 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4bae7398-7181-4ce3-bc77-e1bafe05fe52-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c982t\" (UID: \"4bae7398-7181-4ce3-bc77-e1bafe05fe52\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c982t" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637796 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24e97113-beed-4628-94f1-bff371e2fecd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-k2dxr\" (UID: \"24e97113-beed-4628-94f1-bff371e2fecd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 
08:47:54.637812 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n29x\" (UniqueName: \"kubernetes.io/projected/74ac0fc1-580a-4095-a2aa-c4a5c1796ae0-kube-api-access-2n29x\") pod \"dns-default-hjqhw\" (UID: \"74ac0fc1-580a-4095-a2aa-c4a5c1796ae0\") " pod="openshift-dns/dns-default-hjqhw" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637831 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74ac0fc1-580a-4095-a2aa-c4a5c1796ae0-config-volume\") pod \"dns-default-hjqhw\" (UID: \"74ac0fc1-580a-4095-a2aa-c4a5c1796ae0\") " pod="openshift-dns/dns-default-hjqhw" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637848 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mshh5\" (UniqueName: \"kubernetes.io/projected/26d0f330-c397-4f6a-8478-a706dc0c1af2-kube-api-access-mshh5\") pod \"catalog-operator-68c6474976-k78cx\" (UID: \"26d0f330-c397-4f6a-8478-a706dc0c1af2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637872 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbhq6\" (UniqueName: \"kubernetes.io/projected/cd55dbfe-4f78-44d1-85a2-7871197c0874-kube-api-access-tbhq6\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637891 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637907 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95ec09f6-783a-4661-9312-2d8f96a33825-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tv47r\" (UID: \"95ec09f6-783a-4661-9312-2d8f96a33825\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637922 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3eefc5b-29c8-4016-83b9-f178e1de9a52-apiservice-cert\") pod \"packageserver-d55dfcdfc-6kdtm\" (UID: \"d3eefc5b-29c8-4016-83b9-f178e1de9a52\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637938 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/26d0f330-c397-4f6a-8478-a706dc0c1af2-profile-collector-cert\") pod \"catalog-operator-68c6474976-k78cx\" (UID: \"26d0f330-c397-4f6a-8478-a706dc0c1af2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637954 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/26d0f330-c397-4f6a-8478-a706dc0c1af2-srv-cert\") pod \"catalog-operator-68c6474976-k78cx\" (UID: \"26d0f330-c397-4f6a-8478-a706dc0c1af2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637971 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/95ec09f6-783a-4661-9312-2d8f96a33825-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tv47r\" (UID: \"95ec09f6-783a-4661-9312-2d8f96a33825\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.637986 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt6nm\" (UniqueName: \"kubernetes.io/projected/4d09752b-8e55-4212-b8b7-81adee9ffa6b-kube-api-access-dt6nm\") pod \"ingress-canary-r6b4b\" (UID: \"4d09752b-8e55-4212-b8b7-81adee9ffa6b\") " pod="openshift-ingress-canary/ingress-canary-r6b4b" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.638002 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/24e97113-beed-4628-94f1-bff371e2fecd-images\") pod \"machine-config-operator-74547568cd-k2dxr\" (UID: \"24e97113-beed-4628-94f1-bff371e2fecd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.638021 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62vcq\" (UniqueName: \"kubernetes.io/projected/3fa6dc4c-4a71-427a-9a01-567e0d4c0325-kube-api-access-62vcq\") pod \"olm-operator-6b444d44fb-dd52h\" (UID: \"3fa6dc4c-4a71-427a-9a01-567e0d4c0325\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.638041 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0af8dc28-2edf-4a81-bcb9-a5d54ccf8174-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fgfsf\" (UID: \"0af8dc28-2edf-4a81-bcb9-a5d54ccf8174\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf" Feb 
26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.638060 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gg9g\" (UniqueName: \"kubernetes.io/projected/0e938a2d-afc6-4360-aa50-e93c71d2c8c3-kube-api-access-8gg9g\") pod \"auto-csr-approver-29534926-k666h\" (UID: \"0e938a2d-afc6-4360-aa50-e93c71d2c8c3\") " pod="openshift-infra/auto-csr-approver-29534926-k666h" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.638080 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dabc4264-eb11-4a46-ad16-75e286b8f776-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6nml7\" (UID: \"dabc4264-eb11-4a46-ad16-75e286b8f776\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.638097 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8bdc47f-6d95-402a-95a7-2f61af7a65ee-config\") pod \"service-ca-operator-777779d784-4w7fl\" (UID: \"c8bdc47f-6d95-402a-95a7-2f61af7a65ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.638112 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3fa6dc4c-4a71-427a-9a01-567e0d4c0325-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dd52h\" (UID: \"3fa6dc4c-4a71-427a-9a01-567e0d4c0325\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.638127 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b654495a-8604-48ee-9a86-32c178849124-node-bootstrap-token\") pod \"machine-config-server-n2g7g\" 
(UID: \"b654495a-8604-48ee-9a86-32c178849124\") " pod="openshift-machine-config-operator/machine-config-server-n2g7g" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.638141 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b654495a-8604-48ee-9a86-32c178849124-certs\") pod \"machine-config-server-n2g7g\" (UID: \"b654495a-8604-48ee-9a86-32c178849124\") " pod="openshift-machine-config-operator/machine-config-server-n2g7g" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.638157 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpb57\" (UniqueName: \"kubernetes.io/projected/e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc-kube-api-access-tpb57\") pod \"machine-config-controller-84d6567774-8cn5f\" (UID: \"e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.638182 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-registration-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.638199 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-plugins-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.638228 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3fa6dc4c-4a71-427a-9a01-567e0d4c0325-srv-cert\") pod 
\"olm-operator-6b444d44fb-dd52h\" (UID: \"3fa6dc4c-4a71-427a-9a01-567e0d4c0325\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.641225 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8cn5f\" (UID: \"e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.641624 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3fa6dc4c-4a71-427a-9a01-567e0d4c0325-srv-cert\") pod \"olm-operator-6b444d44fb-dd52h\" (UID: \"3fa6dc4c-4a71-427a-9a01-567e0d4c0325\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" Feb 26 08:47:54 crc kubenswrapper[4925]: E0226 08:47:54.641873 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.141860171 +0000 UTC m=+207.281434664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.642322 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3eefc5b-29c8-4016-83b9-f178e1de9a52-webhook-cert\") pod \"packageserver-d55dfcdfc-6kdtm\" (UID: \"d3eefc5b-29c8-4016-83b9-f178e1de9a52\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.643218 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f1854f8f-c17d-479c-b0f4-46159b9d0da8-signing-cabundle\") pod \"service-ca-9c57cc56f-h4mp7\" (UID: \"f1854f8f-c17d-479c-b0f4-46159b9d0da8\") " pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.645401 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95ec09f6-783a-4661-9312-2d8f96a33825-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-tv47r\" (UID: \"95ec09f6-783a-4661-9312-2d8f96a33825\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.646845 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d09752b-8e55-4212-b8b7-81adee9ffa6b-cert\") pod \"ingress-canary-r6b4b\" (UID: \"4d09752b-8e55-4212-b8b7-81adee9ffa6b\") " 
pod="openshift-ingress-canary/ingress-canary-r6b4b" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.647227 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8bdc47f-6d95-402a-95a7-2f61af7a65ee-serving-cert\") pod \"service-ca-operator-777779d784-4w7fl\" (UID: \"c8bdc47f-6d95-402a-95a7-2f61af7a65ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.650022 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b412aae4-dae9-42ff-96e6-8e982f62eb11-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-48gkw\" (UID: \"b412aae4-dae9-42ff-96e6-8e982f62eb11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.650217 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-csi-data-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.650771 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95ec09f6-783a-4661-9312-2d8f96a33825-config\") pod \"kube-apiserver-operator-766d6c64bb-tv47r\" (UID: \"95ec09f6-783a-4661-9312-2d8f96a33825\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.651601 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3eefc5b-29c8-4016-83b9-f178e1de9a52-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-6kdtm\" (UID: \"d3eefc5b-29c8-4016-83b9-f178e1de9a52\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.651756 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/24e97113-beed-4628-94f1-bff371e2fecd-images\") pod \"machine-config-operator-74547568cd-k2dxr\" (UID: \"24e97113-beed-4628-94f1-bff371e2fecd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.652501 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dabc4264-eb11-4a46-ad16-75e286b8f776-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6nml7\" (UID: \"dabc4264-eb11-4a46-ad16-75e286b8f776\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.652690 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dabc4264-eb11-4a46-ad16-75e286b8f776-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6nml7\" (UID: \"dabc4264-eb11-4a46-ad16-75e286b8f776\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.653069 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d90b9c6c-9c34-4d18-bda8-0bf6e482e71b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-f5g2n\" (UID: \"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.653320 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c8bdc47f-6d95-402a-95a7-2f61af7a65ee-config\") pod \"service-ca-operator-777779d784-4w7fl\" (UID: \"c8bdc47f-6d95-402a-95a7-2f61af7a65ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.653650 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-registration-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.653715 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-plugins-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.655277 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d3eefc5b-29c8-4016-83b9-f178e1de9a52-tmpfs\") pod \"packageserver-d55dfcdfc-6kdtm\" (UID: \"d3eefc5b-29c8-4016-83b9-f178e1de9a52\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.655326 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74ac0fc1-580a-4095-a2aa-c4a5c1796ae0-metrics-tls\") pod \"dns-default-hjqhw\" (UID: \"74ac0fc1-580a-4095-a2aa-c4a5c1796ae0\") " pod="openshift-dns/dns-default-hjqhw" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.655380 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-mountpoint-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.655449 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd55dbfe-4f78-44d1-85a2-7871197c0874-socket-dir\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.655979 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b412aae4-dae9-42ff-96e6-8e982f62eb11-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-48gkw\" (UID: \"b412aae4-dae9-42ff-96e6-8e982f62eb11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.656106 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/24e97113-beed-4628-94f1-bff371e2fecd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-k2dxr\" (UID: \"24e97113-beed-4628-94f1-bff371e2fecd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.656765 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd490e7f-4d64-4fc3-9c8d-17236968eafd-config-volume\") pod \"collect-profiles-29534925-7hrlx\" (UID: \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.658135 4925 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74ac0fc1-580a-4095-a2aa-c4a5c1796ae0-config-volume\") pod \"dns-default-hjqhw\" (UID: \"74ac0fc1-580a-4095-a2aa-c4a5c1796ae0\") " pod="openshift-dns/dns-default-hjqhw" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.659932 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f1854f8f-c17d-479c-b0f4-46159b9d0da8-signing-key\") pod \"service-ca-9c57cc56f-h4mp7\" (UID: \"f1854f8f-c17d-479c-b0f4-46159b9d0da8\") " pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.662485 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b654495a-8604-48ee-9a86-32c178849124-node-bootstrap-token\") pod \"machine-config-server-n2g7g\" (UID: \"b654495a-8604-48ee-9a86-32c178849124\") " pod="openshift-machine-config-operator/machine-config-server-n2g7g" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.665410 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/26d0f330-c397-4f6a-8478-a706dc0c1af2-srv-cert\") pod \"catalog-operator-68c6474976-k78cx\" (UID: \"26d0f330-c397-4f6a-8478-a706dc0c1af2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.666170 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc-proxy-tls\") pod \"machine-config-controller-84d6567774-8cn5f\" (UID: \"e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.666882 4925 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0af8dc28-2edf-4a81-bcb9-a5d54ccf8174-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-fgfsf\" (UID: \"0af8dc28-2edf-4a81-bcb9-a5d54ccf8174\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.667050 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4bae7398-7181-4ce3-bc77-e1bafe05fe52-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c982t\" (UID: \"4bae7398-7181-4ce3-bc77-e1bafe05fe52\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c982t" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.667120 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/26d0f330-c397-4f6a-8478-a706dc0c1af2-profile-collector-cert\") pod \"catalog-operator-68c6474976-k78cx\" (UID: \"26d0f330-c397-4f6a-8478-a706dc0c1af2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.669023 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24e97113-beed-4628-94f1-bff371e2fecd-proxy-tls\") pod \"machine-config-operator-74547568cd-k2dxr\" (UID: \"24e97113-beed-4628-94f1-bff371e2fecd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.669089 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b654495a-8604-48ee-9a86-32c178849124-certs\") pod \"machine-config-server-n2g7g\" (UID: \"b654495a-8604-48ee-9a86-32c178849124\") " pod="openshift-machine-config-operator/machine-config-server-n2g7g" 
Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.669569 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd490e7f-4d64-4fc3-9c8d-17236968eafd-secret-volume\") pod \"collect-profiles-29534925-7hrlx\" (UID: \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.670078 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3fa6dc4c-4a71-427a-9a01-567e0d4c0325-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dd52h\" (UID: \"3fa6dc4c-4a71-427a-9a01-567e0d4c0325\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.674374 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvv2h\" (UniqueName: \"kubernetes.io/projected/705f399b-a347-4aa1-a502-7b9720e67d71-kube-api-access-nvv2h\") pod \"authentication-operator-69f744f599-bpqx7\" (UID: \"705f399b-a347-4aa1-a502-7b9720e67d71\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.732707 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-gl96d"] Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.733669 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctf6s\" (UniqueName: \"kubernetes.io/projected/0fc3c69d-10be-4495-a469-8b975f26754f-kube-api-access-ctf6s\") pod \"router-default-5444994796-h5g2f\" (UID: \"0fc3c69d-10be-4495-a469-8b975f26754f\") " pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.739231 4925 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:54 crc kubenswrapper[4925]: E0226 08:47:54.740082 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.2400557 +0000 UTC m=+207.379629983 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.750412 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2"] Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.767357 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84kkq\" (UniqueName: \"kubernetes.io/projected/3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e-kube-api-access-84kkq\") pod \"etcd-operator-b45778765-djlxf\" (UID: \"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: W0226 08:47:54.772805 4925 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a7e81a4_3454_4467_8355_9a19f0274073.slice/crio-bac7fca36cea5de0ecfea02d2b6c7f2a7815da961da3527224bc3658ca31d2b9 WatchSource:0}: Error finding container bac7fca36cea5de0ecfea02d2b6c7f2a7815da961da3527224bc3658ca31d2b9: Status 404 returned error can't find the container with id bac7fca36cea5de0ecfea02d2b6c7f2a7815da961da3527224bc3658ca31d2b9 Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.777170 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gcl4\" (UniqueName: \"kubernetes.io/projected/c637160e-e623-415a-b0e4-9277dee94439-kube-api-access-4gcl4\") pod \"oauth-openshift-558db77b4-btcxz\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.784849 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z"] Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.786362 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4zhr\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-kube-api-access-x4zhr\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.794937 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb"] Feb 26 08:47:54 crc kubenswrapper[4925]: W0226 08:47:54.801993 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-ffc854f748277b06fd41031666a59fd4583b4cf89eaeaf48a5d6b1c06f17f23f WatchSource:0}: Error finding container 
ffc854f748277b06fd41031666a59fd4583b4cf89eaeaf48a5d6b1c06f17f23f: Status 404 returned error can't find the container with id ffc854f748277b06fd41031666a59fd4583b4cf89eaeaf48a5d6b1c06f17f23f Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.807939 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-bound-sa-token\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.813715 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqlbp\" (UniqueName: \"kubernetes.io/projected/fa394c7c-78af-4696-8fc3-73e8301effd6-kube-api-access-sqlbp\") pod \"dns-operator-744455d44c-nsvcl\" (UID: \"fa394c7c-78af-4696-8fc3-73e8301effd6\") " pod="openshift-dns-operator/dns-operator-744455d44c-nsvcl" Feb 26 08:47:54 crc kubenswrapper[4925]: W0226 08:47:54.813898 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60dcb438_eb57_425d_a9a0_c1dfe9a884e2.slice/crio-b7f3982b47bf25f4da2eed832850c63f13c1476afcd0e6eb976f6ce91bc7d037 WatchSource:0}: Error finding container b7f3982b47bf25f4da2eed832850c63f13c1476afcd0e6eb976f6ce91bc7d037: Status 404 returned error can't find the container with id b7f3982b47bf25f4da2eed832850c63f13c1476afcd0e6eb976f6ce91bc7d037 Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.814037 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kv2xl"] Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.814682 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.816573 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-96njb"] Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.818692 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2"] Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.820564 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-b5l7b"] Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.823443 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.830364 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.841637 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: E0226 08:47:54.842186 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.342166594 +0000 UTC m=+207.481740877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.848890 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57w97\" (UniqueName: \"kubernetes.io/projected/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-kube-api-access-57w97\") pod \"controller-manager-879f6c89f-8hj5v\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.853479 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rk9z\" (UniqueName: \"kubernetes.io/projected/d90b9c6c-9c34-4d18-bda8-0bf6e482e71b-kube-api-access-2rk9z\") pod \"ingress-operator-5b745b69d9-f5g2n\" (UID: \"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" Feb 26 08:47:54 crc kubenswrapper[4925]: W0226 08:47:54.854953 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c3241c5_45b5_464d_866b_690bcedf1934.slice/crio-14d8384811ba60254b32ba9639909194abd3f512e71e3aeeaf98d2eb741fa290 WatchSource:0}: Error finding container 14d8384811ba60254b32ba9639909194abd3f512e71e3aeeaf98d2eb741fa290: Status 404 returned error can't find the container with id 14d8384811ba60254b32ba9639909194abd3f512e71e3aeeaf98d2eb741fa290 Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.858238 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver/apiserver-76f77b778f-nkgtm"] Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.860683 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.866336 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.870153 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6mjmn\" (UID: \"5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.881047 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nsvcl" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.887139 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.888430 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xztd\" (UniqueName: \"kubernetes.io/projected/b59f8920-d5ae-4e79-9923-75219a4e00fa-kube-api-access-7xztd\") pod \"console-operator-58897d9998-t62dr\" (UID: \"b59f8920-d5ae-4e79-9923-75219a4e00fa\") " pod="openshift-console-operator/console-operator-58897d9998-t62dr" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.891886 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.913383 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxkn9\" (UniqueName: \"kubernetes.io/projected/996aef4b-8460-499a-82ff-eb4fc29e8ca5-kube-api-access-kxkn9\") pod \"openshift-controller-manager-operator-756b6f6bc6-f6tqj\" (UID: \"996aef4b-8460-499a-82ff-eb4fc29e8ca5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.920477 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.933908 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwfzh\" (UniqueName: \"kubernetes.io/projected/d3eefc5b-29c8-4016-83b9-f178e1de9a52-kube-api-access-fwfzh\") pod \"packageserver-d55dfcdfc-6kdtm\" (UID: \"d3eefc5b-29c8-4016-83b9-f178e1de9a52\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.935822 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj"] Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.945587 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:54 crc kubenswrapper[4925]: E0226 08:47:54.945777 4925 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.445743514 +0000 UTC m=+207.585317797 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.946064 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:54 crc kubenswrapper[4925]: E0226 08:47:54.946678 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.446665796 +0000 UTC m=+207.586240079 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.951362 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvxvq\" (UniqueName: \"kubernetes.io/projected/4bae7398-7181-4ce3-bc77-e1bafe05fe52-kube-api-access-rvxvq\") pod \"multus-admission-controller-857f4d67dd-c982t\" (UID: \"4bae7398-7181-4ce3-bc77-e1bafe05fe52\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c982t" Feb 26 08:47:54 crc kubenswrapper[4925]: W0226 08:47:54.961053 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b52ddb4_6e78_4b20_bccf_0b3fab8fc71d.slice/crio-82369a287f9e210e258654b021109c9a23df7c54d5bc3e795ae4e96e0ef3dbb3 WatchSource:0}: Error finding container 82369a287f9e210e258654b021109c9a23df7c54d5bc3e795ae4e96e0ef3dbb3: Status 404 returned error can't find the container with id 82369a287f9e210e258654b021109c9a23df7c54d5bc3e795ae4e96e0ef3dbb3 Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.967854 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbhq6\" (UniqueName: \"kubernetes.io/projected/cd55dbfe-4f78-44d1-85a2-7871197c0874-kube-api-access-tbhq6\") pod \"csi-hostpathplugin-5nm8c\" (UID: \"cd55dbfe-4f78-44d1-85a2-7871197c0874\") " pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.968659 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.973169 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-c982t" Feb 26 08:47:54 crc kubenswrapper[4925]: W0226 08:47:54.974832 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68011ad7_d008_4442_9505_4ff72735198e.slice/crio-edd05cb6268dc7797221ae539001b1b302afc49c9973826169137bfa6849e09e WatchSource:0}: Error finding container edd05cb6268dc7797221ae539001b1b302afc49c9973826169137bfa6849e09e: Status 404 returned error can't find the container with id edd05cb6268dc7797221ae539001b1b302afc49c9973826169137bfa6849e09e Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.985722 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" Feb 26 08:47:54 crc kubenswrapper[4925]: I0226 08:47:54.989033 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rvvr\" (UniqueName: \"kubernetes.io/projected/c8bdc47f-6d95-402a-95a7-2f61af7a65ee-kube-api-access-4rvvr\") pod \"service-ca-operator-777779d784-4w7fl\" (UID: \"c8bdc47f-6d95-402a-95a7-2f61af7a65ee\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.011666 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znrhn\" (UniqueName: \"kubernetes.io/projected/f1854f8f-c17d-479c-b0f4-46159b9d0da8-kube-api-access-znrhn\") pod \"service-ca-9c57cc56f-h4mp7\" (UID: \"f1854f8f-c17d-479c-b0f4-46159b9d0da8\") " pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.024115 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.029380 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.035501 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xr54\" (UniqueName: \"kubernetes.io/projected/b412aae4-dae9-42ff-96e6-8e982f62eb11-kube-api-access-7xr54\") pod \"kube-storage-version-migrator-operator-b67b599dd-48gkw\" (UID: \"b412aae4-dae9-42ff-96e6-8e982f62eb11\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.048256 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:55 crc kubenswrapper[4925]: E0226 08:47:55.048440 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.548397721 +0000 UTC m=+207.687971994 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.049011 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gg9g\" (UniqueName: \"kubernetes.io/projected/0e938a2d-afc6-4360-aa50-e93c71d2c8c3-kube-api-access-8gg9g\") pod \"auto-csr-approver-29534926-k666h\" (UID: \"0e938a2d-afc6-4360-aa50-e93c71d2c8c3\") " pod="openshift-infra/auto-csr-approver-29534926-k666h" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.049024 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:55 crc kubenswrapper[4925]: E0226 08:47:55.049405 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.549387955 +0000 UTC m=+207.688962438 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.066481 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534926-k666h" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.078283 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g697\" (UniqueName: \"kubernetes.io/projected/b654495a-8604-48ee-9a86-32c178849124-kube-api-access-7g697\") pod \"machine-config-server-n2g7g\" (UID: \"b654495a-8604-48ee-9a86-32c178849124\") " pod="openshift-machine-config-operator/machine-config-server-n2g7g" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.082279 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.094320 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.094371 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67pdr\" (UniqueName: \"kubernetes.io/projected/dabc4264-eb11-4a46-ad16-75e286b8f776-kube-api-access-67pdr\") pod \"marketplace-operator-79b997595-6nml7\" (UID: \"dabc4264-eb11-4a46-ad16-75e286b8f776\") " pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.115903 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n2g7g" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.124723 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt6nm\" (UniqueName: \"kubernetes.io/projected/4d09752b-8e55-4212-b8b7-81adee9ffa6b-kube-api-access-dt6nm\") pod \"ingress-canary-r6b4b\" (UID: \"4d09752b-8e55-4212-b8b7-81adee9ffa6b\") " pod="openshift-ingress-canary/ingress-canary-r6b4b" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.135502 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42krz\" (UniqueName: \"kubernetes.io/projected/0af8dc28-2edf-4a81-bcb9-a5d54ccf8174-kube-api-access-42krz\") pod \"package-server-manager-789f6589d5-fgfsf\" (UID: \"0af8dc28-2edf-4a81-bcb9-a5d54ccf8174\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.146151 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-t62dr" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.150252 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:55 crc kubenswrapper[4925]: E0226 08:47:55.150952 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.650935955 +0000 UTC m=+207.790510238 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.152975 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.173089 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62vcq\" (UniqueName: \"kubernetes.io/projected/3fa6dc4c-4a71-427a-9a01-567e0d4c0325-kube-api-access-62vcq\") pod \"olm-operator-6b444d44fb-dd52h\" (UID: \"3fa6dc4c-4a71-427a-9a01-567e0d4c0325\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.182939 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpb57\" (UniqueName: \"kubernetes.io/projected/e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc-kube-api-access-tpb57\") pod \"machine-config-controller-84d6567774-8cn5f\" (UID: \"e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.189420 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv"] Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.206723 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95ec09f6-783a-4661-9312-2d8f96a33825-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-tv47r\" (UID: \"95ec09f6-783a-4661-9312-2d8f96a33825\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.220113 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6n9c\" (UniqueName: \"kubernetes.io/projected/dd490e7f-4d64-4fc3-9c8d-17236968eafd-kube-api-access-s6n9c\") pod \"collect-profiles-29534925-7hrlx\" (UID: \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx" 
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.237704 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.242820 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.254736 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:55 crc kubenswrapper[4925]: E0226 08:47:55.255821 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.755797417 +0000 UTC m=+207.895371700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.255986 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.266776 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.276854 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfc9b\" (UniqueName: \"kubernetes.io/projected/24e97113-beed-4628-94f1-bff371e2fecd-kube-api-access-sfc9b\") pod \"machine-config-operator-74547568cd-k2dxr\" (UID: \"24e97113-beed-4628-94f1-bff371e2fecd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.280200 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzw6l\" (UniqueName: \"kubernetes.io/projected/9cd6715c-c655-46b0-8dd1-b8bfcb253db4-kube-api-access-lzw6l\") pod \"migrator-59844c95c7-4pmw2\" (UID: \"9cd6715c-c655-46b0-8dd1-b8bfcb253db4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4pmw2" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.281090 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8hj5v"] Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.281645 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" Feb 26 08:47:55 crc kubenswrapper[4925]: W0226 08:47:55.286045 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fc3c69d_10be_4495_a469_8b975f26754f.slice/crio-45bcbcb5ab5022f53c7dc9adca79e3ff4731c833cb193cf94fdccae0eafa0b3b WatchSource:0}: Error finding container 45bcbcb5ab5022f53c7dc9adca79e3ff4731c833cb193cf94fdccae0eafa0b3b: Status 404 returned error can't find the container with id 45bcbcb5ab5022f53c7dc9adca79e3ff4731c833cb193cf94fdccae0eafa0b3b Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.289942 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.295634 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mshh5\" (UniqueName: \"kubernetes.io/projected/26d0f330-c397-4f6a-8478-a706dc0c1af2-kube-api-access-mshh5\") pod \"catalog-operator-68c6474976-k78cx\" (UID: \"26d0f330-c397-4f6a-8478-a706dc0c1af2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.297986 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.306316 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7"
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.308712 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n29x\" (UniqueName: \"kubernetes.io/projected/74ac0fc1-580a-4095-a2aa-c4a5c1796ae0-kube-api-access-2n29x\") pod \"dns-default-hjqhw\" (UID: \"74ac0fc1-580a-4095-a2aa-c4a5c1796ae0\") " pod="openshift-dns/dns-default-hjqhw"
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.317197 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf"
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.337550 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx"
Feb 26 08:47:55 crc kubenswrapper[4925]: W0226 08:47:55.353402 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffe17e7a_f03b_4b6e_93a6_be7e5bf4d31b.slice/crio-d6da8dace305dfec133cb949118bc77600f463834ef16f2466341f777ddd45a0 WatchSource:0}: Error finding container d6da8dace305dfec133cb949118bc77600f463834ef16f2466341f777ddd45a0: Status 404 returned error can't find the container with id d6da8dace305dfec133cb949118bc77600f463834ef16f2466341f777ddd45a0
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.355851 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:47:55 crc kubenswrapper[4925]: E0226 08:47:55.356171 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.856156358 +0000 UTC m=+207.995730641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.386123 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pxx78"]
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.402294 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r6b4b"
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.407244 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hjqhw"
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.451365 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn"]
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.457716 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:55 crc kubenswrapper[4925]: E0226 08:47:55.458067 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:55.958052107 +0000 UTC m=+208.097626390 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.510335 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2" event={"ID":"60428194-e5a3-4578-872a-6d564e7b0a7c","Type":"ContainerStarted","Data":"028b43e68382d3f76b8e288161eb892131c2dbad319c4a75cce0c98e0c295ec9"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.510393 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2" event={"ID":"60428194-e5a3-4578-872a-6d564e7b0a7c","Type":"ContainerStarted","Data":"92a263f7f22eb831104c89aaeb06c104d565ca4ed19816abbd7b641a638b67c6"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.513864 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nsvcl"]
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.514252 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d9f690e3c79ce10958b40f4e51e669a1d995c6c22be7b115db66c1836f3b4324"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.514308 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ffc854f748277b06fd41031666a59fd4583b4cf89eaeaf48a5d6b1c06f17f23f"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.514765 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.522425 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" event={"ID":"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d","Type":"ContainerStarted","Data":"82369a287f9e210e258654b021109c9a23df7c54d5bc3e795ae4e96e0ef3dbb3"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.524561 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" event={"ID":"66d0db3d-0908-48d2-be73-e1afd790f010","Type":"ContainerStarted","Data":"0eddb0731238b9a576a17905acebe6a28a03ca0ea67884a957843b381297d3fa"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.526072 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-h5g2f" event={"ID":"0fc3c69d-10be-4495-a469-8b975f26754f","Type":"ContainerStarted","Data":"45bcbcb5ab5022f53c7dc9adca79e3ff4731c833cb193cf94fdccae0eafa0b3b"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.534181 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kv2xl" event={"ID":"7c3241c5-45b5-464d-866b-690bcedf1934","Type":"ContainerStarted","Data":"22eb370c48ee37d34ad474e51d5f8ace3b2d741a5c1cf9ec9c299c8338eb7b06"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.534246 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kv2xl" event={"ID":"7c3241c5-45b5-464d-866b-690bcedf1934","Type":"ContainerStarted","Data":"14d8384811ba60254b32ba9639909194abd3f512e71e3aeeaf98d2eb741fa290"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.536470 4925 generic.go:334] "Generic (PLEG): container finished" podID="9a7e81a4-3454-4467-8355-9a19f0274073" containerID="71ebb4b3f9dd65aff161f76967de306169240c6684ebda67258a60778778c3b6" exitCode=0
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.536633 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d" event={"ID":"9a7e81a4-3454-4467-8355-9a19f0274073","Type":"ContainerDied","Data":"71ebb4b3f9dd65aff161f76967de306169240c6684ebda67258a60778778c3b6"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.536731 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d" event={"ID":"9a7e81a4-3454-4467-8355-9a19f0274073","Type":"ContainerStarted","Data":"bac7fca36cea5de0ecfea02d2b6c7f2a7815da961da3527224bc3658ca31d2b9"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.539158 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"84ebc2454c92453beda67a746350898d9e9aa598568af9657ad902e953fd4f5d"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.539202 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"97410e08d4562606f2beb5c47e9f90ec8fa4989f08877da3c7c25f5c2afee34e"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.542669 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj" event={"ID":"68011ad7-d008-4442-9505-4ff72735198e","Type":"ContainerStarted","Data":"edd05cb6268dc7797221ae539001b1b302afc49c9973826169137bfa6849e09e"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.561293 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.561558 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4pmw2"
Feb 26 08:47:55 crc kubenswrapper[4925]: E0226 08:47:55.561827 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:56.061797291 +0000 UTC m=+208.201371574 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.570332 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z" event={"ID":"3423b879-0617-45e9-a197-11d3c67141b7","Type":"ContainerStarted","Data":"a7b49d86b1b3a5f88c382508a6fdd7ccf0b425cfe304480af0830122a17dd48d"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.579910 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ab06f836ca532edb204747ca4f38325183641b05ff4bfd49df6200bc263d3703"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.585371 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" event={"ID":"60dcb438-eb57-425d-a9a0-c1dfe9a884e2","Type":"ContainerStarted","Data":"8e2e48cacc19942d2679edccd58011931af997a132f38d7c7f6c53c99d077de2"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.585425 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" event={"ID":"60dcb438-eb57-425d-a9a0-c1dfe9a884e2","Type":"ContainerStarted","Data":"b7f3982b47bf25f4da2eed832850c63f13c1476afcd0e6eb976f6ce91bc7d037"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.587628 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb"
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.600052 4925 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-pv5bb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.600115 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" podUID="60dcb438-eb57-425d-a9a0-c1dfe9a884e2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.604057 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-btcxz"]
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.606950 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2" event={"ID":"43219832-a329-442e-8bb7-200a4deff15d","Type":"ContainerStarted","Data":"ca09ded1a891dc1d4afa88fea245638cd7839b79bb4d8eccf0fc89b73a62d186"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.606988 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2" event={"ID":"43219832-a329-442e-8bb7-200a4deff15d","Type":"ContainerStarted","Data":"b7ebb909831ce34f6e0abef99fb66a9af48c31e19ab714abf5a8f1112e3b8b3d"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.610676 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-96njb" event={"ID":"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc","Type":"ContainerStarted","Data":"019690b942a73a6389ccbed9630335f211aa08068b2fa9c4bc665040ad90a9b3"}
Feb 26 08:47:55 crc kubenswrapper[4925]: W0226 08:47:55.611199 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e131998_c782_4fda_bd5a_0fe0255c143d.slice/crio-a80a044b0b674d7209add44e5aa648fddde5c540c01a41d3850e22cc404b35fd WatchSource:0}: Error finding container a80a044b0b674d7209add44e5aa648fddde5c540c01a41d3850e22cc404b35fd: Status 404 returned error can't find the container with id a80a044b0b674d7209add44e5aa648fddde5c540c01a41d3850e22cc404b35fd
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.611588 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" event={"ID":"37e33b34-0cc8-402b-b214-59e62bf03cfb","Type":"ContainerStarted","Data":"f1ff8db9d19f3b88ef640f4d44560143af843b582c763573653b0456e38257cc"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.643391 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-b5l7b" event={"ID":"cd116d5f-0452-44f0-ac17-2a0e2e083b93","Type":"ContainerStarted","Data":"68e7d88b823db77321e1d0e63261b8811481cf4ecf8596b95bb0b099b50a604d"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.644457 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-b5l7b"
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.654068 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" event={"ID":"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b","Type":"ContainerStarted","Data":"d6da8dace305dfec133cb949118bc77600f463834ef16f2466341f777ddd45a0"}
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.658974 4925 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5l7b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.659026 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b5l7b" podUID="cd116d5f-0452-44f0-ac17-2a0e2e083b93" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.662363 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl"]
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.663413 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:55 crc kubenswrapper[4925]: E0226 08:47:55.665390 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:56.165377181 +0000 UTC m=+208.304951464 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.670275 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c982t"]
Feb 26 08:47:55 crc kubenswrapper[4925]: W0226 08:47:55.677912 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b91e7c6_743f_4b13_a9e9_a59dd125eaaa.slice/crio-6e843f92ce28998c567a7d5d2536e1d2b81ce9541e32255d1ec47b67dea0fc20 WatchSource:0}: Error finding container 6e843f92ce28998c567a7d5d2536e1d2b81ce9541e32255d1ec47b67dea0fc20: Status 404 returned error can't find the container with id 6e843f92ce28998c567a7d5d2536e1d2b81ce9541e32255d1ec47b67dea0fc20
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.769212 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:47:55 crc kubenswrapper[4925]: E0226 08:47:55.770771 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:56.270736644 +0000 UTC m=+208.410310927 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:55 crc kubenswrapper[4925]: W0226 08:47:55.813921 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa394c7c_78af_4696_8fc3_73e8301effd6.slice/crio-003797e5588fce485746d00d7fd680c38612c0ede2ff0ff8138ec7c00bb3455f WatchSource:0}: Error finding container 003797e5588fce485746d00d7fd680c38612c0ede2ff0ff8138ec7c00bb3455f: Status 404 returned error can't find the container with id 003797e5588fce485746d00d7fd680c38612c0ede2ff0ff8138ec7c00bb3455f
Feb 26 08:47:55 crc kubenswrapper[4925]: W0226 08:47:55.823161 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc637160e_e623_415a_b0e4_9277dee94439.slice/crio-956ea3fd7df4dad1d7ec15e6538fd0c1d13feade21f6f9a3a84aca649e20fed7 WatchSource:0}: Error finding container 956ea3fd7df4dad1d7ec15e6538fd0c1d13feade21f6f9a3a84aca649e20fed7: Status 404 returned error can't find the container with id 956ea3fd7df4dad1d7ec15e6538fd0c1d13feade21f6f9a3a84aca649e20fed7
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.873548 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:55 crc kubenswrapper[4925]: E0226 08:47:55.873981 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:56.373964165 +0000 UTC m=+208.513538448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.891985 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6nml7"]
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.942793 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bpqx7"]
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.960799 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5nm8c"]
Feb 26 08:47:55 crc kubenswrapper[4925]: I0226 08:47:55.974221 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:47:55 crc kubenswrapper[4925]: E0226 08:47:55.974714 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:56.474694136 +0000 UTC m=+208.614268419 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.079824 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:56 crc kubenswrapper[4925]: E0226 08:47:56.080712 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:56.580694905 +0000 UTC m=+208.720269188 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.202789 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:47:56 crc kubenswrapper[4925]: E0226 08:47:56.203320 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:56.70330334 +0000 UTC m=+208.842877623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.327516 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:56 crc kubenswrapper[4925]: E0226 08:47:56.328398 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:56.828371724 +0000 UTC m=+208.967946007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.432290 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:47:56 crc kubenswrapper[4925]: E0226 08:47:56.432825 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:56.932805305 +0000 UTC m=+209.072379588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.466745 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-b5l7b" podStartSLOduration=160.466709673 podStartE2EDuration="2m40.466709673s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:56.417724597 +0000 UTC m=+208.557298880" watchObservedRunningTime="2026-02-26 08:47:56.466709673 +0000 UTC m=+208.606283956"
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.469207 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f"]
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.490273 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-djlxf"]
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.535712 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:56 crc kubenswrapper[4925]: E0226 08:47:56.536296 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:57.036271042 +0000 UTC m=+209.175845325 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.558835 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9jwb2" podStartSLOduration=160.558799143 podStartE2EDuration="2m40.558799143s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:56.550213143 +0000 UTC m=+208.689787436" watchObservedRunningTime="2026-02-26 08:47:56.558799143 +0000 UTC m=+208.698373436"
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.638055 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:47:56 crc kubenswrapper[4925]: E0226 08:47:56.638387 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:57.138358226 +0000 UTC m=+209.277932509 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.719419 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"9c30fff232696e329776556f2692000a302ad1cf6f7ed6108e91c12cb4a28dd3"}
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.722296 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" event={"ID":"c637160e-e623-415a-b0e4-9277dee94439","Type":"ContainerStarted","Data":"956ea3fd7df4dad1d7ec15e6538fd0c1d13feade21f6f9a3a84aca649e20fed7"}
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.724812 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-96njb" event={"ID":"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc","Type":"ContainerStarted","Data":"8fb79b9ee7f6cb99d543f62190c1aaec6eec3bef7ddc63c80e820236d52cc1d2"}
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.727570 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn" event={"ID":"5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9","Type":"ContainerStarted","Data":"e7d13f67edabba2b6c8dc388155f9076f4106a6c33070bd0431a71cf10f29b66"}
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.740175 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:47:56 crc kubenswrapper[4925]: E0226 08:47:56.741055 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:57.241035663 +0000 UTC m=+209.380609946 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.760434 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" event={"ID":"705f399b-a347-4aa1-a502-7b9720e67d71","Type":"ContainerStarted","Data":"3ab92fb59148f7e7e713181ec49a5ee83e02207608d6df8f5ca7704de1423400"}
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.778363 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n2g7g" event={"ID":"b654495a-8604-48ee-9a86-32c178849124","Type":"ContainerStarted","Data":"9c004ad83b8ac0e7c8615e777f17674373d98e66fedc474a53a011f7e337c7c4"}
Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.790785 4925 kubelet.go:2453] "SyncLoop (PLEG): event
for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c982t" event={"ID":"4bae7398-7181-4ce3-bc77-e1bafe05fe52","Type":"ContainerStarted","Data":"c738738986cc2306af528bee7253f53367398b5a758bde967bc7402168a6939c"} Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.798133 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" podStartSLOduration=160.798103567 podStartE2EDuration="2m40.798103567s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:56.796366024 +0000 UTC m=+208.935940317" watchObservedRunningTime="2026-02-26 08:47:56.798103567 +0000 UTC m=+208.937677850" Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.839079 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" event={"ID":"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d","Type":"ContainerStarted","Data":"aa438f6d47531bbebd766be28c205c9025c84bd73138fdfb507ba47ec424df59"} Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.842762 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:56 crc kubenswrapper[4925]: E0226 08:47:56.857612 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:57.357515918 +0000 UTC m=+209.497090201 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.925199 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" event={"ID":"dabc4264-eb11-4a46-ad16-75e286b8f776","Type":"ContainerStarted","Data":"c15307215d40d3c732a4cd731806534419cbcb80148c806244169bf18d5d180b"} Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.945641 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:56 crc kubenswrapper[4925]: E0226 08:47:56.946030 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:57.446017589 +0000 UTC m=+209.585591872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.978115 4925 generic.go:334] "Generic (PLEG): container finished" podID="66d0db3d-0908-48d2-be73-e1afd790f010" containerID="7f16ae2f4cd082ce09f2f94350fe010ecc4ee2ade44baf2db7f4f12d806160d3" exitCode=0 Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.978260 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" event={"ID":"66d0db3d-0908-48d2-be73-e1afd790f010","Type":"ContainerDied","Data":"7f16ae2f4cd082ce09f2f94350fe010ecc4ee2ade44baf2db7f4f12d806160d3"} Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.982086 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-kv2xl" podStartSLOduration=160.982049209 podStartE2EDuration="2m40.982049209s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:56.971302247 +0000 UTC m=+209.110876530" watchObservedRunningTime="2026-02-26 08:47:56.982049209 +0000 UTC m=+209.121623482" Feb 26 08:47:56 crc kubenswrapper[4925]: I0226 08:47:56.996055 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-94fn2" podStartSLOduration=160.996042611 podStartE2EDuration="2m40.996042611s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:56.995661572 +0000 UTC m=+209.135235855" watchObservedRunningTime="2026-02-26 08:47:56.996042611 +0000 UTC m=+209.135616894" Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.013029 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj" event={"ID":"68011ad7-d008-4442-9505-4ff72735198e","Type":"ContainerStarted","Data":"f3424d4b4c768545a8c091e6f7138085419e82347d669e73179638047f2a497b"} Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.046638 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:57 crc kubenswrapper[4925]: E0226 08:47:57.047007 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:57.546984306 +0000 UTC m=+209.686558579 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.064938 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" event={"ID":"cd55dbfe-4f78-44d1-85a2-7871197c0874","Type":"ContainerStarted","Data":"c8173653b1ef2691fbcead7af4d48ac827eeccaf1ff1cdf4572cd53e5e9a0017"} Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.145801 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z" event={"ID":"3423b879-0617-45e9-a197-11d3c67141b7","Type":"ContainerStarted","Data":"f6795393af73937ff92199dc9049664a64ee483bd3263c9ed80908b665755429"} Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.179033 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:57 crc kubenswrapper[4925]: E0226 08:47:57.180239 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:57.68021253 +0000 UTC m=+209.819786813 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.220865 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-b5l7b" event={"ID":"cd116d5f-0452-44f0-ac17-2a0e2e083b93","Type":"ContainerStarted","Data":"1cae393eb942810386e272cd8e2881e180b5af2aabcb6458c44d261e3548b8f4"} Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.223271 4925 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5l7b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.223413 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b5l7b" podUID="cd116d5f-0452-44f0-ac17-2a0e2e083b93" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.234150 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.243193 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nsvcl" 
event={"ID":"fa394c7c-78af-4696-8fc3-73e8301effd6","Type":"ContainerStarted","Data":"003797e5588fce485746d00d7fd680c38612c0ede2ff0ff8138ec7c00bb3455f"} Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.281682 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" event={"ID":"6e131998-c782-4fda-bd5a-0fe0255c143d","Type":"ContainerStarted","Data":"a80a044b0b674d7209add44e5aa648fddde5c540c01a41d3850e22cc404b35fd"} Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.285382 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:57 crc kubenswrapper[4925]: E0226 08:47:57.285892 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:57.78586475 +0000 UTC m=+209.925439033 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.287891 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:57 crc kubenswrapper[4925]: E0226 08:47:57.290910 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:57.790699608 +0000 UTC m=+209.930273891 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.388923 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:57 crc kubenswrapper[4925]: E0226 08:47:57.389577 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:57.889552423 +0000 UTC m=+210.029126726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.416669 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r6b4b"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.416709 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl" event={"ID":"2b91e7c6-743f-4b13-a9e9-a59dd125eaaa","Type":"ContainerStarted","Data":"6e843f92ce28998c567a7d5d2536e1d2b81ce9541e32255d1ec47b67dea0fc20"} Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.463918 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.482354 4925 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.483798 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.491836 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" 
Feb 26 08:47:57 crc kubenswrapper[4925]: E0226 08:47:57.510490 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:58.010466416 +0000 UTC m=+210.150040699 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.538034 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29534926-k666h"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.539241 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tldwj" podStartSLOduration=161.539223209 podStartE2EDuration="2m41.539223209s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:57.29036275 +0000 UTC m=+209.429937033" watchObservedRunningTime="2026-02-26 08:47:57.539223209 +0000 UTC m=+209.678797492" Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.572897 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.575645 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.577898 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.607588 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:57 crc kubenswrapper[4925]: E0226 08:47:57.607900 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:58.107884676 +0000 UTC m=+210.247458949 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.617731 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h4mp7"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.640205 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.644141 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.652176 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.659684 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.660092 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.670269 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.671078 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-58897d9998-t62dr"] Feb 26 08:47:57 crc kubenswrapper[4925]: W0226 08:47:57.673567 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0af8dc28_2edf_4a81_bcb9_a5d54ccf8174.slice/crio-e0b3c9bb53f48ebe31d57f53ede9305a804afcccc349b69e90b011f9a10840e0 WatchSource:0}: Error finding container e0b3c9bb53f48ebe31d57f53ede9305a804afcccc349b69e90b011f9a10840e0: Status 404 returned error can't find the container with id e0b3c9bb53f48ebe31d57f53ede9305a804afcccc349b69e90b011f9a10840e0 Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.711134 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:57 crc kubenswrapper[4925]: E0226 08:47:57.711590 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:58.211577208 +0000 UTC m=+210.351151481 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.713239 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4pmw2"] Feb 26 08:47:57 crc kubenswrapper[4925]: W0226 08:47:57.790220 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cd6715c_c655_46b0_8dd1_b8bfcb253db4.slice/crio-76174658da16307ad08ff2c939597bd2f98fe42bbfa799ba38a27ca9ff36fc61 WatchSource:0}: Error finding container 76174658da16307ad08ff2c939597bd2f98fe42bbfa799ba38a27ca9ff36fc61: Status 404 returned error can't find the container with id 76174658da16307ad08ff2c939597bd2f98fe42bbfa799ba38a27ca9ff36fc61 Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.790591 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hjqhw"] Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.829788 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:57 crc kubenswrapper[4925]: E0226 08:47:57.832237 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:58.332218265 +0000 UTC m=+210.471792548 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:57 crc kubenswrapper[4925]: I0226 08:47:57.933880 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:57 crc kubenswrapper[4925]: E0226 08:47:57.934637 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:58.434615696 +0000 UTC m=+210.574189979 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.036656 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:58 crc kubenswrapper[4925]: E0226 08:47:58.037260 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:58.537235333 +0000 UTC m=+210.676809616 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.141612 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:58 crc kubenswrapper[4925]: E0226 08:47:58.143498 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:58.643469407 +0000 UTC m=+210.783043690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.244388 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:58 crc kubenswrapper[4925]: E0226 08:47:58.244975 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:58.744949186 +0000 UTC m=+210.884523469 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.346598 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:58 crc kubenswrapper[4925]: E0226 08:47:58.349515 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:58.84949492 +0000 UTC m=+210.989069203 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.438477 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" event={"ID":"dabc4264-eb11-4a46-ad16-75e286b8f776","Type":"ContainerStarted","Data":"70fc8563458bbfe8669bd85ef9d0514f77ce897a3df52a72c89c814dcebc2c0b"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.440052 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.451212 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:58 crc kubenswrapper[4925]: E0226 08:47:58.451869 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:58.951836759 +0000 UTC m=+211.091411042 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.452161 4925 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6nml7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.452226 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" podUID="dabc4264-eb11-4a46-ad16-75e286b8f776" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.456433 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" event={"ID":"66d0db3d-0908-48d2-be73-e1afd790f010","Type":"ContainerStarted","Data":"40651d6a2f14b6676b1db815a90ebb20410e85cb80dd4cbd3abbb80e5051fc1b"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.474392 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" podStartSLOduration=162.47436854 podStartE2EDuration="2m42.47436854s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 
08:47:58.473502768 +0000 UTC m=+210.613077061" watchObservedRunningTime="2026-02-26 08:47:58.47436854 +0000 UTC m=+210.613942823" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.475868 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf" event={"ID":"0af8dc28-2edf-4a81-bcb9-a5d54ccf8174","Type":"ContainerStarted","Data":"e0b3c9bb53f48ebe31d57f53ede9305a804afcccc349b69e90b011f9a10840e0"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.479556 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n2g7g" event={"ID":"b654495a-8604-48ee-9a86-32c178849124","Type":"ContainerStarted","Data":"8eb20bddfa4f97aef9a7c67896a546525ab1671a400f87f3904b78edc75cbc58"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.511928 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-n2g7g" podStartSLOduration=6.511908607 podStartE2EDuration="6.511908607s" podCreationTimestamp="2026-02-26 08:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:58.503947022 +0000 UTC m=+210.643521305" watchObservedRunningTime="2026-02-26 08:47:58.511908607 +0000 UTC m=+210.651482890" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.553772 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:58 crc kubenswrapper[4925]: E0226 08:47:58.555078 4925 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:59.055061691 +0000 UTC m=+211.194635974 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.644593 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r6b4b" event={"ID":"4d09752b-8e55-4212-b8b7-81adee9ffa6b","Type":"ContainerStarted","Data":"89190c037185929fc6da1c6fd59ca919d2cdba487789a00264cb9b723aa3c7d3"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.645024 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r6b4b" event={"ID":"4d09752b-8e55-4212-b8b7-81adee9ffa6b","Type":"ContainerStarted","Data":"182e93f81090bafbcb14d67039b7571807074e79168c01efe7a6ab1f99e3fac9"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.645043 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r" event={"ID":"95ec09f6-783a-4661-9312-2d8f96a33825","Type":"ContainerStarted","Data":"11c698469f4fe27317a62637e7986af99fc0c74320a3c59dd988874868b9096f"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.645055 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" 
event={"ID":"24e97113-beed-4628-94f1-bff371e2fecd","Type":"ContainerStarted","Data":"b129dfb97e8f169cc49211c30ef0d57247dfd7f92823d3ba23b505ef23f47311"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.645065 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f" event={"ID":"e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc","Type":"ContainerStarted","Data":"af74487961e38b2adca05027a4fef7c3a3ea1fdbde7ffdfd38dbf82940e279c0"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.645080 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f" event={"ID":"e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc","Type":"ContainerStarted","Data":"294234912ccb18089e081a8eb44655001cbf8704ecdc8defd7c197e1f2aad0a8"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.653157 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hjqhw" event={"ID":"74ac0fc1-580a-4095-a2aa-c4a5c1796ae0","Type":"ContainerStarted","Data":"a2cb87504cf0ee59dfe6ac963f63278eabc807645bc5270b5321793c6d610d62"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.655286 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7" event={"ID":"f1854f8f-c17d-479c-b0f4-46159b9d0da8","Type":"ContainerStarted","Data":"d950624abff981cc2c1fac8009cef9954f03ef9ce2073253c30fd584c43ab2e0"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.655335 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7" event={"ID":"f1854f8f-c17d-479c-b0f4-46159b9d0da8","Type":"ContainerStarted","Data":"834d760189b437ce98fdc0bb22040553b65227c206cf50d10628eaa6d045e2a0"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.655763 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:58 crc kubenswrapper[4925]: E0226 08:47:58.656017 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:59.155998336 +0000 UTC m=+211.295572619 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.656256 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:58 crc kubenswrapper[4925]: E0226 08:47:58.656773 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:59.156753234 +0000 UTC m=+211.296327507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.659396 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw" event={"ID":"b412aae4-dae9-42ff-96e6-8e982f62eb11","Type":"ContainerStarted","Data":"c415cd5791a1c0cdbd40b9f1b6cc24d32aca0697323530f8be2ff33fc567641c"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.662002 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj" event={"ID":"996aef4b-8460-499a-82ff-eb4fc29e8ca5","Type":"ContainerStarted","Data":"d502c479e7d663ea357d0695f81984fe3a3989b48245a9f4c8bbea0b8d6fd29e"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.662055 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj" event={"ID":"996aef4b-8460-499a-82ff-eb4fc29e8ca5","Type":"ContainerStarted","Data":"b3de4840271aa74d7915d3e57089fbd23fc07a3426a3b2a70b3281ce90999473"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.666328 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" event={"ID":"d3eefc5b-29c8-4016-83b9-f178e1de9a52","Type":"ContainerStarted","Data":"0af55fd41aca2616a51e51ab2d80f78e30461d690744b558b2916d0cdf00e121"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.666376 4925 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" event={"ID":"d3eefc5b-29c8-4016-83b9-f178e1de9a52","Type":"ContainerStarted","Data":"02956ebc35cc66b5152f34a325db0f56efd6b2f861403e86a0ef6da6cfe5da6c"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.673845 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.677638 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn" event={"ID":"5c4fd3e3-031e-4c99-96f1-a0a5c84e92e9","Type":"ContainerStarted","Data":"8e166535b096c593db8cf897d3eaa1116c5d7be89eeba2d686397377b799b11f"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.701387 4925 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdtm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.701485 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" podUID="d3eefc5b-29c8-4016-83b9-f178e1de9a52" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.740439 4925 generic.go:334] "Generic (PLEG): container finished" podID="37e33b34-0cc8-402b-b214-59e62bf03cfb" containerID="457b67f51bd6e2271e78b60fa397d25fc4f447b687d683d1d05986745a85aa10" exitCode=0 Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.740515 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" event={"ID":"37e33b34-0cc8-402b-b214-59e62bf03cfb","Type":"ContainerDied","Data":"457b67f51bd6e2271e78b60fa397d25fc4f447b687d683d1d05986745a85aa10"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.756941 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:58 crc kubenswrapper[4925]: E0226 08:47:58.758730 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:59.258715215 +0000 UTC m=+211.398289488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.770827 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d" event={"ID":"9a7e81a4-3454-4467-8355-9a19f0274073","Type":"ContainerStarted","Data":"7788038b7f51064253d158de8837594fd045915b5f7aacbb589fadf651de39bf"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.770879 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.786281 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nsvcl" event={"ID":"fa394c7c-78af-4696-8fc3-73e8301effd6","Type":"ContainerStarted","Data":"ee36deab91d215d109c0d37562071672d374909cb69a056e0d456b1d8b571eb6"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.790045 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" event={"ID":"26d0f330-c397-4f6a-8478-a706dc0c1af2","Type":"ContainerStarted","Data":"aae948f5bb995bc265cbb519f94196546af11ad51926ab64c74215d2ece4c33c"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.791712 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.799034 4925 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-k78cx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.799124 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" podUID="26d0f330-c397-4f6a-8478-a706dc0c1af2" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.844484 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-t62dr" 
event={"ID":"b59f8920-d5ae-4e79-9923-75219a4e00fa","Type":"ContainerStarted","Data":"f2e29c6755c415c1416c5f81710828bc71e47d27291d9b88f8f7af6536b58462"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.846123 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-t62dr" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.847916 4925 patch_prober.go:28] interesting pod/console-operator-58897d9998-t62dr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.848002 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-t62dr" podUID="b59f8920-d5ae-4e79-9923-75219a4e00fa" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.850579 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" event={"ID":"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e","Type":"ContainerStarted","Data":"15d382d03352f717ab210202981e18a171ddaa239d161cbf8620111f5053cde0"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.850645 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" event={"ID":"3a9a7858-794a-472c-b6e7-9a0a3e8a4a9e","Type":"ContainerStarted","Data":"cebd4a45305fd1d592072202a40d117bd01bd55df9a31cfdf776edd8171f0cf4"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.858879 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:58 crc kubenswrapper[4925]: E0226 08:47:58.861935 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:59.361911655 +0000 UTC m=+211.501485938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.881403 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" event={"ID":"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b","Type":"ContainerStarted","Data":"85e37e47369d4075feae2baf947a8d974c887948f10f1a1960d056ff0e3e55b3"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.882001 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.890799 4925 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-8hj5v container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 26 08:47:58 crc 
kubenswrapper[4925]: I0226 08:47:58.890862 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" podUID="ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.908041 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-h5g2f" event={"ID":"0fc3c69d-10be-4495-a469-8b975f26754f","Type":"ContainerStarted","Data":"b90ae5c96352a139dc951b30d7e919fe5032e497e210c0cd73124f0035a7bc26"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.924906 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.926843 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-96njb" event={"ID":"c355d0b0-6cfe-4e12-b3a3-c08d389cf8dc","Type":"ContainerStarted","Data":"89ab74ea2adcd058c06091102c04251a012603e0581fb627300c98ba63d1c6df"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.932586 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:47:58 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:47:58 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:47:58 crc kubenswrapper[4925]: healthz check failed Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.932650 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.959881 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:58 crc kubenswrapper[4925]: E0226 08:47:58.961209 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:59.46118993 +0000 UTC m=+211.600764213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.966155 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" event={"ID":"705f399b-a347-4aa1-a502-7b9720e67d71","Type":"ContainerStarted","Data":"b06908bca7d7f31378a4110ab0917b3d772d9c75a23829306be13d82569e1350"} Feb 26 08:47:58 crc kubenswrapper[4925]: I0226 08:47:58.978662 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx" 
event={"ID":"dd490e7f-4d64-4fc3-9c8d-17236968eafd","Type":"ContainerStarted","Data":"93bbcde44ba175d0d29d683fe4b9f55d4580b90216daebb7b5d608d5839ff755"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.056872 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" event={"ID":"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b","Type":"ContainerStarted","Data":"a037c5103e141be44dacaaf2f3056d0f87860a3e9dd3fce5825fba42096025a1"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.057261 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" event={"ID":"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b","Type":"ContainerStarted","Data":"4c7c9042c8d93bbad8c318b022157aa34bd52c38f017c3a05309a92c942451f9"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.073056 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:59 crc kubenswrapper[4925]: E0226 08:47:59.073508 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:59.573488393 +0000 UTC m=+211.713063016 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.120507 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" event={"ID":"c637160e-e623-415a-b0e4-9277dee94439","Type":"ContainerStarted","Data":"3eabaa6d5f3739ecb2b9f1bdb0150e374dd132c0efdaad7834205e78db9f8463"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.121174 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.173798 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z" event={"ID":"3423b879-0617-45e9-a197-11d3c67141b7","Type":"ContainerStarted","Data":"c21e01e4e7c1b90ce6b670f023ce2bbaad396b31c0f0ac80cd454d58ec2ab6ff"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.175126 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:59 crc kubenswrapper[4925]: E0226 08:47:59.178176 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-02-26 08:47:59.67815983 +0000 UTC m=+211.817734103 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.251846 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl" event={"ID":"2b91e7c6-743f-4b13-a9e9-a59dd125eaaa","Type":"ContainerStarted","Data":"ee2aef5e91ae0fd8dbb6bb761baf98258904ca1c3aa4cc615707f2140cbd6708"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.263281 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl" event={"ID":"c8bdc47f-6d95-402a-95a7-2f61af7a65ee","Type":"ContainerStarted","Data":"7568db0e3c8e4d0b05085c4a199a187956a3701e2d61194482a7cd917c1f50f0"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.263398 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl" event={"ID":"c8bdc47f-6d95-402a-95a7-2f61af7a65ee","Type":"ContainerStarted","Data":"681e293a330e7d92a9db35f3451694d9fe8dffe7c8bcfac51e3467841e168740"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.279246 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: 
\"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:59 crc kubenswrapper[4925]: E0226 08:47:59.279543 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:59.779517345 +0000 UTC m=+211.919091628 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.286364 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534926-k666h" event={"ID":"0e938a2d-afc6-4360-aa50-e93c71d2c8c3","Type":"ContainerStarted","Data":"9ae2843c7e040e3cb6df50c0f7846d0086e9f87cadee98ea0d95635d01d9e87c"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.314473 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4pmw2" event={"ID":"9cd6715c-c655-46b0-8dd1-b8bfcb253db4","Type":"ContainerStarted","Data":"0ecc6355dd2d06547908f155d1d069bd26b7a85e73fcded8ae1bf99588e48192"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.314517 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4pmw2" event={"ID":"9cd6715c-c655-46b0-8dd1-b8bfcb253db4","Type":"ContainerStarted","Data":"76174658da16307ad08ff2c939597bd2f98fe42bbfa799ba38a27ca9ff36fc61"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.335958 
4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" event={"ID":"5b52ddb4-6e78-4b20-bccf-0b3fab8fc71d","Type":"ContainerStarted","Data":"aa4a94767ee8dfff15f5660c062819dae94e61e0f319c10d3cf1575b1e72cde1"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.356229 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c982t" event={"ID":"4bae7398-7181-4ce3-bc77-e1bafe05fe52","Type":"ContainerStarted","Data":"4780c8b8d277f4a335018c44229ad4f5418fc73671ab9f7560a26dd7cec395a0"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.356297 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c982t" event={"ID":"4bae7398-7181-4ce3-bc77-e1bafe05fe52","Type":"ContainerStarted","Data":"5f0db320b42f742ac8f089031869ea042ef956ca30d5e923e649e9e40137e134"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.378041 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" event={"ID":"3fa6dc4c-4a71-427a-9a01-567e0d4c0325","Type":"ContainerStarted","Data":"c90a6fd7d9f842dcc1bcdc88b7cb27d1ac70eff3807bf7c614db564981cfa038"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.378124 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" event={"ID":"3fa6dc4c-4a71-427a-9a01-567e0d4c0325","Type":"ContainerStarted","Data":"92b2d9e67de0703fabe9fe8797b5e45559e0e82f7b50c25bd489500b3e8813ea"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.380536 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.382058 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:59 crc kubenswrapper[4925]: E0226 08:47:59.384253 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:47:59.884227693 +0000 UTC m=+212.023801976 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.384295 4925 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dd52h container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.384373 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" podUID="3fa6dc4c-4a71-427a-9a01-567e0d4c0325" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.403876 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" event={"ID":"6e131998-c782-4fda-bd5a-0fe0255c143d","Type":"ContainerStarted","Data":"b718154871ad72ba3c8c0eb57729f7da97e59b5404b01b0a4205cd699fa25990"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.403928 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" event={"ID":"6e131998-c782-4fda-bd5a-0fe0255c143d","Type":"ContainerStarted","Data":"0a84d6ca249042882513c1060fe8cfaf288e9eb9c28876e98c2e7cac933d79e2"} Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.408349 4925 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5l7b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.408429 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b5l7b" podUID="cd116d5f-0452-44f0-ac17-2a0e2e083b93" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.455113 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-gdr9z" podStartSLOduration=163.455091264 podStartE2EDuration="2m43.455091264s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:59.451745222 +0000 UTC m=+211.591319505" watchObservedRunningTime="2026-02-26 08:47:59.455091264 +0000 UTC m=+211.594665547" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.484272 4925 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:59 crc kubenswrapper[4925]: E0226 08:47:59.485746 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:47:59.985727852 +0000 UTC m=+212.125302135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.508797 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" podStartSLOduration=163.508752915 podStartE2EDuration="2m43.508752915s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:59.481235892 +0000 UTC m=+211.620810185" watchObservedRunningTime="2026-02-26 08:47:59.508752915 +0000 UTC m=+211.648327198" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.561589 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-f6tqj" 
podStartSLOduration=163.561554584 podStartE2EDuration="2m43.561554584s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:59.546881776 +0000 UTC m=+211.686456059" watchObservedRunningTime="2026-02-26 08:47:59.561554584 +0000 UTC m=+211.701128887" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.587909 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:59 crc kubenswrapper[4925]: E0226 08:47:59.588294 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:00.088277867 +0000 UTC m=+212.227852150 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.661859 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-chlnd" podStartSLOduration=164.661809083 podStartE2EDuration="2m44.661809083s" podCreationTimestamp="2026-02-26 08:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:59.587310233 +0000 UTC m=+211.726884516" watchObservedRunningTime="2026-02-26 08:47:59.661809083 +0000 UTC m=+211.801383356" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.663100 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-pxx78" podStartSLOduration=163.663093964 podStartE2EDuration="2m43.663093964s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:59.660804408 +0000 UTC m=+211.800378711" watchObservedRunningTime="2026-02-26 08:47:59.663093964 +0000 UTC m=+211.802668247" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.693616 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: 
\"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:59 crc kubenswrapper[4925]: E0226 08:47:59.694414 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:00.194387579 +0000 UTC m=+212.333961862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.765267 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" podStartSLOduration=164.765238749 podStartE2EDuration="2m44.765238749s" podCreationTimestamp="2026-02-26 08:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:59.763001015 +0000 UTC m=+211.902575308" watchObservedRunningTime="2026-02-26 08:47:59.765238749 +0000 UTC m=+211.904813032" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.795038 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:47:59 crc kubenswrapper[4925]: E0226 08:47:59.795518 4925 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:00.295499398 +0000 UTC m=+212.435073681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.808580 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mzkdl" podStartSLOduration=164.808552347 podStartE2EDuration="2m44.808552347s" podCreationTimestamp="2026-02-26 08:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:59.793311895 +0000 UTC m=+211.932886178" watchObservedRunningTime="2026-02-26 08:47:59.808552347 +0000 UTC m=+211.948126630" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.824083 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bpqx7" podStartSLOduration=164.824066716 podStartE2EDuration="2m44.824066716s" podCreationTimestamp="2026-02-26 08:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:59.821135845 +0000 UTC m=+211.960710128" watchObservedRunningTime="2026-02-26 08:47:59.824066716 +0000 UTC m=+211.963640999" Feb 
26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.896760 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-h5g2f" podStartSLOduration=163.896736711 podStartE2EDuration="2m43.896736711s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:59.892479637 +0000 UTC m=+212.032053940" watchObservedRunningTime="2026-02-26 08:47:59.896736711 +0000 UTC m=+212.036310994" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.916255 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:47:59 crc kubenswrapper[4925]: E0226 08:47:59.916955 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:00.416931054 +0000 UTC m=+212.556505337 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.951499 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d" podStartSLOduration=163.951482448 podStartE2EDuration="2m43.951482448s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:59.950352561 +0000 UTC m=+212.089926864" watchObservedRunningTime="2026-02-26 08:47:59.951482448 +0000 UTC m=+212.091056731" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.956314 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:47:59 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:47:59 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:47:59 crc kubenswrapper[4925]: healthz check failed Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.956366 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.989804 4925 ???:1] "http: TLS handshake error from 
192.168.126.11:49726: no serving certificate available for the kubelet" Feb 26 08:47:59 crc kubenswrapper[4925]: I0226 08:47:59.992447 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw" podStartSLOduration=163.992421828 podStartE2EDuration="2m43.992421828s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:47:59.977307269 +0000 UTC m=+212.116881552" watchObservedRunningTime="2026-02-26 08:47:59.992421828 +0000 UTC m=+212.131996111" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.005669 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-t62dr" podStartSLOduration=164.005647161 podStartE2EDuration="2m44.005647161s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.002701939 +0000 UTC m=+212.142276222" watchObservedRunningTime="2026-02-26 08:48:00.005647161 +0000 UTC m=+212.145221444" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.024222 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:48:00 crc kubenswrapper[4925]: E0226 08:48:00.024597 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 08:48:00.524576094 +0000 UTC m=+212.664150377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.088559 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-c982t" podStartSLOduration=164.088543426 podStartE2EDuration="2m44.088543426s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.080979441 +0000 UTC m=+212.220553724" watchObservedRunningTime="2026-02-26 08:48:00.088543426 +0000 UTC m=+212.228117719" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.122989 4925 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-btcxz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.123055 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" podUID="c637160e-e623-415a-b0e4-9277dee94439" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.127416 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:48:00 crc kubenswrapper[4925]: E0226 08:48:00.127883 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:00.627871607 +0000 UTC m=+212.767445890 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.155864 4925 ???:1] "http: TLS handshake error from 192.168.126.11:49732: no serving certificate available for the kubelet" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.156845 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" podStartSLOduration=164.156827354 podStartE2EDuration="2m44.156827354s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.156202069 +0000 UTC m=+212.295776352" 
watchObservedRunningTime="2026-02-26 08:48:00.156827354 +0000 UTC m=+212.296401637" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.212045 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29534928-pxjft"] Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.212798 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534928-pxjft" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.228349 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.228678 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2qjw\" (UniqueName: \"kubernetes.io/projected/b2a81774-e811-40f9-a623-c50a6bcd5d7b-kube-api-access-b2qjw\") pod \"auto-csr-approver-29534928-pxjft\" (UID: \"b2a81774-e811-40f9-a623-c50a6bcd5d7b\") " pod="openshift-infra/auto-csr-approver-29534928-pxjft" Feb 26 08:48:00 crc kubenswrapper[4925]: E0226 08:48:00.228913 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:00.728891734 +0000 UTC m=+212.868466017 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.249105 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" podStartSLOduration=164.249088297 podStartE2EDuration="2m44.249088297s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.238654933 +0000 UTC m=+212.378229226" watchObservedRunningTime="2026-02-26 08:48:00.249088297 +0000 UTC m=+212.388662580" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.249902 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29534928-pxjft"] Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.304712 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx" podStartSLOduration=165.304688734 podStartE2EDuration="2m45.304688734s" podCreationTimestamp="2026-02-26 08:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.303510586 +0000 UTC m=+212.443084869" watchObservedRunningTime="2026-02-26 08:48:00.304688734 +0000 UTC m=+212.444263017" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.306223 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-4w7fl" podStartSLOduration=164.306188721 podStartE2EDuration="2m44.306188721s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.269801723 +0000 UTC m=+212.409376016" watchObservedRunningTime="2026-02-26 08:48:00.306188721 +0000 UTC m=+212.445763004" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.321849 4925 ???:1] "http: TLS handshake error from 192.168.126.11:49740: no serving certificate available for the kubelet" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.329596 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2qjw\" (UniqueName: \"kubernetes.io/projected/b2a81774-e811-40f9-a623-c50a6bcd5d7b-kube-api-access-b2qjw\") pod \"auto-csr-approver-29534928-pxjft\" (UID: \"b2a81774-e811-40f9-a623-c50a6bcd5d7b\") " pod="openshift-infra/auto-csr-approver-29534928-pxjft" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.329657 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:48:00 crc kubenswrapper[4925]: E0226 08:48:00.329976 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:00.829961162 +0000 UTC m=+212.969535445 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.366156 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2qjw\" (UniqueName: \"kubernetes.io/projected/b2a81774-e811-40f9-a623-c50a6bcd5d7b-kube-api-access-b2qjw\") pod \"auto-csr-approver-29534928-pxjft\" (UID: \"b2a81774-e811-40f9-a623-c50a6bcd5d7b\") " pod="openshift-infra/auto-csr-approver-29534928-pxjft" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.373201 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-gl96d" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.430639 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:48:00 crc kubenswrapper[4925]: E0226 08:48:00.431402 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:00.931387849 +0000 UTC m=+213.070962132 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.433230 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" event={"ID":"24e97113-beed-4628-94f1-bff371e2fecd","Type":"ContainerStarted","Data":"fdc56582ec815b2bf6ef3ae4747cfdddedd24d51ff26c8075692952fa289b168"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.433332 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" event={"ID":"24e97113-beed-4628-94f1-bff371e2fecd","Type":"ContainerStarted","Data":"349320a4504d23f1091a587a9f2cc9a41b8baf315180e394ce3bd46014da4726"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.442942 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" podStartSLOduration=164.442925531 podStartE2EDuration="2m44.442925531s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.354392758 +0000 UTC m=+212.493967041" watchObservedRunningTime="2026-02-26 08:48:00.442925531 +0000 UTC m=+212.582499814" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.449747 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f" 
event={"ID":"e7bbf476-7ed6-40eb-ad7b-28ed2f1286fc","Type":"ContainerStarted","Data":"09b003a672cefcd4b2d8ed28a950cd824c44afb8f9b6ffac9298997ff2044308"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.466918 4925 ???:1] "http: TLS handshake error from 192.168.126.11:49744: no serving certificate available for the kubelet" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.470710 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hjqhw" event={"ID":"74ac0fc1-580a-4095-a2aa-c4a5c1796ae0","Type":"ContainerStarted","Data":"88a134a6df6af4c6d689b85703a0047102440262a14a230bc22ded42bebaea7b"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.470867 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hjqhw" event={"ID":"74ac0fc1-580a-4095-a2aa-c4a5c1796ae0","Type":"ContainerStarted","Data":"c918a6d29c8cd4aa5f77540a9f67817e044b8a60488d2a3608ed4ede9a4e5bff"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.471566 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hjqhw" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.505749 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-48gkw" event={"ID":"b412aae4-dae9-42ff-96e6-8e982f62eb11","Type":"ContainerStarted","Data":"38f38dc3b2ed4b1dbc0315a108e6f36074b7918fa8d68407259adc2f09251b52"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.531429 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" event={"ID":"26d0f330-c397-4f6a-8478-a706dc0c1af2","Type":"ContainerStarted","Data":"d2ea7f48bc8d57f2d5de531d431dbf14cd5ce22765f6a9ab15db45ca51cabb43"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.531890 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca/service-ca-9c57cc56f-h4mp7" podStartSLOduration=164.531869893 podStartE2EDuration="2m44.531869893s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.45066214 +0000 UTC m=+212.590236423" watchObservedRunningTime="2026-02-26 08:48:00.531869893 +0000 UTC m=+212.671444176" Feb 26 08:48:00 crc kubenswrapper[4925]: E0226 08:48:00.532729 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:01.032712504 +0000 UTC m=+213.172286787 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.532408 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.534935 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29534928-pxjft" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.545082 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" event={"ID":"37e33b34-0cc8-402b-b214-59e62bf03cfb","Type":"ContainerStarted","Data":"6a4708a63e6db33019f58b66be99d8502a4f904fed7227c7efda971f904ef11d"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.557975 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k78cx" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.558601 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx" event={"ID":"dd490e7f-4d64-4fc3-9c8d-17236968eafd","Type":"ContainerStarted","Data":"f6a099c69a05cb4a59d52a322427c532536eae08f27d966b52efe32a5d698f13"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.580169 4925 ???:1] "http: TLS handshake error from 192.168.126.11:49754: no serving certificate available for the kubelet" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.580152 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r6b4b" podStartSLOduration=8.580129432 podStartE2EDuration="8.580129432s" podCreationTimestamp="2026-02-26 08:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.542007581 +0000 UTC m=+212.681581864" watchObservedRunningTime="2026-02-26 08:48:00.580129432 +0000 UTC m=+212.719703715" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.596519 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" 
event={"ID":"cd55dbfe-4f78-44d1-85a2-7871197c0874","Type":"ContainerStarted","Data":"0f54c34d40f5d9ee26fda7c3dfb243c83d679704c41640e224a14c7ecc0ab855"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.635915 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:48:00 crc kubenswrapper[4925]: E0226 08:48:00.637310 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:01.137284928 +0000 UTC m=+213.276859211 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.644782 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-96njb" podStartSLOduration=164.644759381 podStartE2EDuration="2m44.644759381s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.6422897 +0000 UTC m=+212.781863993" watchObservedRunningTime="2026-02-26 08:48:00.644759381 +0000 UTC m=+212.784333664" Feb 26 08:48:00 crc 
kubenswrapper[4925]: I0226 08:48:00.645035 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6mjmn" podStartSLOduration=164.645028967 podStartE2EDuration="2m44.645028967s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.579034685 +0000 UTC m=+212.718608978" watchObservedRunningTime="2026-02-26 08:48:00.645028967 +0000 UTC m=+212.784603250" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.669722 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" event={"ID":"66d0db3d-0908-48d2-be73-e1afd790f010","Type":"ContainerStarted","Data":"d68551da87196533ac33ca4c2df72026dd38bdcaf3cba15dade8c32e54837764"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.685110 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nsvcl" event={"ID":"fa394c7c-78af-4696-8fc3-73e8301effd6","Type":"ContainerStarted","Data":"00a7c6d94efbfdf38b78b911ac71f31fcc8765f4cc71267c45467cb239f87e52"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.696691 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4pmw2" podStartSLOduration=164.696674189 podStartE2EDuration="2m44.696674189s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.693964473 +0000 UTC m=+212.833538776" watchObservedRunningTime="2026-02-26 08:48:00.696674189 +0000 UTC m=+212.836248462" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.715788 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf" event={"ID":"0af8dc28-2edf-4a81-bcb9-a5d54ccf8174","Type":"ContainerStarted","Data":"0633a4dfd608d4ff60ec7af771a240ba6e619fd48d291c87d26f76ee897ee644"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.715856 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf" event={"ID":"0af8dc28-2edf-4a81-bcb9-a5d54ccf8174","Type":"ContainerStarted","Data":"ab66d190dd6d71bea76a26edce145fd83db95a582dbdce95b2486e4dfbdf1470"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.716734 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.737204 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.750046 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r" event={"ID":"95ec09f6-783a-4661-9312-2d8f96a33825","Type":"ContainerStarted","Data":"ef7e57d85ba03189f939526c473c1965ab672fbaa9dfd43d882784d6e10f11ff"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.751991 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-djlxf" podStartSLOduration=164.75198101 podStartE2EDuration="2m44.75198101s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.751363134 +0000 UTC m=+212.890937417" watchObservedRunningTime="2026-02-26 08:48:00.75198101 +0000 UTC m=+212.891555303" Feb 26 08:48:00 crc kubenswrapper[4925]: E0226 08:48:00.769382 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:01.269352194 +0000 UTC m=+213.408926477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.771899 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4pmw2" event={"ID":"9cd6715c-c655-46b0-8dd1-b8bfcb253db4","Type":"ContainerStarted","Data":"6603a2b39e1a02d0d93532d5cfab7fd32b4c30889582c2f36847a1571b98d97b"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.785204 4925 ???:1] "http: TLS handshake error from 192.168.126.11:49762: no serving certificate available for the kubelet" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.792026 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-t62dr" event={"ID":"b59f8920-d5ae-4e79-9923-75219a4e00fa","Type":"ContainerStarted","Data":"3eaaf2fcb9cd636a3b25f7486f6e5e76ca3d994cd114f04459844a2acd078de3"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.793221 4925 patch_prober.go:28] interesting 
pod/console-operator-58897d9998-t62dr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.793265 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-t62dr" podUID="b59f8920-d5ae-4e79-9923-75219a4e00fa" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.832309 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf" podStartSLOduration=164.832284181 podStartE2EDuration="2m44.832284181s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:00.830863576 +0000 UTC m=+212.970437859" watchObservedRunningTime="2026-02-26 08:48:00.832284181 +0000 UTC m=+212.971858464" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.849195 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:48:00 crc kubenswrapper[4925]: E0226 08:48:00.849403 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 08:48:01.349382049 +0000 UTC m=+213.488956332 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.849755 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:48:00 crc kubenswrapper[4925]: E0226 08:48:00.852273 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:01.352264479 +0000 UTC m=+213.491838762 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.855893 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" event={"ID":"d90b9c6c-9c34-4d18-bda8-0bf6e482e71b","Type":"ContainerStarted","Data":"8f752911bb904140837333455e68d45833a88f79c025817fc0b64c8fb16a9ce1"} Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.858000 4925 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5l7b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.858104 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b5l7b" podUID="cd116d5f-0452-44f0-ac17-2a0e2e083b93" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.874062 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.889757 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.899310 4925 ???:1] "http: TLS handshake 
error from 192.168.126.11:49766: no serving certificate available for the kubelet" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.909952 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dd52h" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.939020 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:00 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:00 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:48:00 crc kubenswrapper[4925]: healthz check failed Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.939101 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.958052 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:48:00 crc kubenswrapper[4925]: E0226 08:48:00.958198 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:01.458173836 +0000 UTC m=+213.597748119 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:48:00 crc kubenswrapper[4925]: I0226 08:48:00.959067 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:48:00 crc kubenswrapper[4925]: E0226 08:48:00.973187 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:01.473168812 +0000 UTC m=+213.612743095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.006659 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hjqhw" podStartSLOduration=9.006627979 podStartE2EDuration="9.006627979s" podCreationTimestamp="2026-02-26 08:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:01.000232033 +0000 UTC m=+213.139806336" watchObservedRunningTime="2026-02-26 08:48:01.006627979 +0000 UTC m=+213.146202262" Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.060313 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:48:01 crc kubenswrapper[4925]: E0226 08:48:01.060613 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:01.560597498 +0000 UTC m=+213.700171781 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.073260 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" podStartSLOduration=165.073244386 podStartE2EDuration="2m45.073244386s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:01.039934573 +0000 UTC m=+213.179508876" watchObservedRunningTime="2026-02-26 08:48:01.073244386 +0000 UTC m=+213.212818669"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.129486 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-nsvcl" podStartSLOduration=165.12946112 podStartE2EDuration="2m45.12946112s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:01.076715071 +0000 UTC m=+213.216289354" watchObservedRunningTime="2026-02-26 08:48:01.12946112 +0000 UTC m=+213.269035393"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.143293 4925 ???:1] "http: TLS handshake error from 192.168.126.11:49780: no serving certificate available for the kubelet"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.162014 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:48:01 crc kubenswrapper[4925]: E0226 08:48:01.162310 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:01.662299282 +0000 UTC m=+213.801873565 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.237616 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-k2dxr" podStartSLOduration=165.237598831 podStartE2EDuration="2m45.237598831s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:01.195267847 +0000 UTC m=+213.334842140" watchObservedRunningTime="2026-02-26 08:48:01.237598831 +0000 UTC m=+213.377173114"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.262450 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8cn5f" podStartSLOduration=165.262434217 podStartE2EDuration="2m45.262434217s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:01.239390575 +0000 UTC m=+213.378964858" watchObservedRunningTime="2026-02-26 08:48:01.262434217 +0000 UTC m=+213.402008500"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.263041 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.263623 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.264676 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:48:01 crc kubenswrapper[4925]: E0226 08:48:01.265033 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:01.76501825 +0000 UTC m=+213.904592534 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.272175 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.283735 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.283979 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.300923 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.367569 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8136d8e6-24c6-404b-9c09-13ba16c0fd86-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8136d8e6-24c6-404b-9c09-13ba16c0fd86\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.367622 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8136d8e6-24c6-404b-9c09-13ba16c0fd86-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8136d8e6-24c6-404b-9c09-13ba16c0fd86\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.367677 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:48:01 crc kubenswrapper[4925]: E0226 08:48:01.367936 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:01.867925194 +0000 UTC m=+214.007499477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.401481 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" podStartSLOduration=166.401465863 podStartE2EDuration="2m46.401465863s" podCreationTimestamp="2026-02-26 08:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:01.390204148 +0000 UTC m=+213.529778431" watchObservedRunningTime="2026-02-26 08:48:01.401465863 +0000 UTC m=+213.541040146"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.426558 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-tv47r" podStartSLOduration=165.426541526 podStartE2EDuration="2m45.426541526s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:01.424594528 +0000 UTC m=+213.564168821" watchObservedRunningTime="2026-02-26 08:48:01.426541526 +0000 UTC m=+213.566115809"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.473951 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:48:01 crc kubenswrapper[4925]: E0226 08:48:01.474050 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:01.974034196 +0000 UTC m=+214.113608469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.474282 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8136d8e6-24c6-404b-9c09-13ba16c0fd86-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8136d8e6-24c6-404b-9c09-13ba16c0fd86\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.474340 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.474377 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8136d8e6-24c6-404b-9c09-13ba16c0fd86-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8136d8e6-24c6-404b-9c09-13ba16c0fd86\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.474686 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8136d8e6-24c6-404b-9c09-13ba16c0fd86-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8136d8e6-24c6-404b-9c09-13ba16c0fd86\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 08:48:01 crc kubenswrapper[4925]: E0226 08:48:01.474917 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:01.974909827 +0000 UTC m=+214.114484110 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.564767 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8136d8e6-24c6-404b-9c09-13ba16c0fd86-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8136d8e6-24c6-404b-9c09-13ba16c0fd86\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.578038 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:48:01 crc kubenswrapper[4925]: E0226 08:48:01.578297 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:02.078280402 +0000 UTC m=+214.217854685 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.648948 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.680262 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:48:01 crc kubenswrapper[4925]: E0226 08:48:01.680686 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:02.180673953 +0000 UTC m=+214.320248236 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.781492 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:48:01 crc kubenswrapper[4925]: E0226 08:48:01.781839 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:02.281823414 +0000 UTC m=+214.421397697 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.861561 4925 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdtm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.861623 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm" podUID="d3eefc5b-29c8-4016-83b9-f178e1de9a52" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.868581 4925 generic.go:334] "Generic (PLEG): container finished" podID="dd490e7f-4d64-4fc3-9c8d-17236968eafd" containerID="f6a099c69a05cb4a59d52a322427c532536eae08f27d966b52efe32a5d698f13" exitCode=0
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.869837 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx" event={"ID":"dd490e7f-4d64-4fc3-9c8d-17236968eafd","Type":"ContainerDied","Data":"f6a099c69a05cb4a59d52a322427c532536eae08f27d966b52efe32a5d698f13"}
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.885298 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:48:01 crc kubenswrapper[4925]: E0226 08:48:01.885737 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:02.385722241 +0000 UTC m=+214.525296524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.899726 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-f5g2n" podStartSLOduration=165.899706943 podStartE2EDuration="2m45.899706943s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:01.840729982 +0000 UTC m=+213.980304265" watchObservedRunningTime="2026-02-26 08:48:01.899706943 +0000 UTC m=+214.039281226"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.902300 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29534928-pxjft"]
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.930103 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 08:48:01 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld
Feb 26 08:48:01 crc kubenswrapper[4925]: [+]process-running ok
Feb 26 08:48:01 crc kubenswrapper[4925]: healthz check failed
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.930159 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 08:48:01 crc kubenswrapper[4925]: I0226 08:48:01.986437 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:48:01 crc kubenswrapper[4925]: E0226 08:48:01.988757 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:02.488719947 +0000 UTC m=+214.628294240 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.015669 4925 ???:1] "http: TLS handshake error from 192.168.126.11:49794: no serving certificate available for the kubelet"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.090810 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:48:02 crc kubenswrapper[4925]: E0226 08:48:02.091163 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:02.591150419 +0000 UTC m=+214.730724702 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.193826 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:48:02 crc kubenswrapper[4925]: E0226 08:48:02.194175 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:02.694159825 +0000 UTC m=+214.833734108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.278541 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.295060 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:48:02 crc kubenswrapper[4925]: E0226 08:48:02.295408 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:02.795393688 +0000 UTC m=+214.934967981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.396369 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:48:02 crc kubenswrapper[4925]: E0226 08:48:02.396728 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:02.896712202 +0000 UTC m=+215.036286485 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.495683 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-t62dr"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.498949 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:48:02 crc kubenswrapper[4925]: E0226 08:48:02.499322 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:02.999301898 +0000 UTC m=+215.138876181 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.507005 4925 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.600800 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:48:02 crc kubenswrapper[4925]: E0226 08:48:02.602050 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:03.102031317 +0000 UTC m=+215.241605600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.609910 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xgq9g"]
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.611129 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.616434 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.653795 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdtm"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.672164 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgq9g"]
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.705219 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czrnm\" (UniqueName: \"kubernetes.io/projected/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-kube-api-access-czrnm\") pod \"community-operators-xgq9g\" (UID: \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\") " pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.705264 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-utilities\") pod \"community-operators-xgq9g\" (UID: \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\") " pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.705306 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-catalog-content\") pod \"community-operators-xgq9g\" (UID: \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\") " pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.705365 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:48:02 crc kubenswrapper[4925]: E0226 08:48:02.705732 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 08:48:03.20571501 +0000 UTC m=+215.345289293 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-rrzqs" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.766720 4925 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-26T08:48:02.507032057Z","Handler":null,"Name":""}
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.785402 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x9xkw"]
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.786374 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x9xkw"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.811473 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.812082 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.812232 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a8eea64-29d5-46bc-89db-23ad451eba4f-catalog-content\") pod \"certified-operators-x9xkw\" (UID: \"5a8eea64-29d5-46bc-89db-23ad451eba4f\") " pod="openshift-marketplace/certified-operators-x9xkw"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.812256 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czrnm\" (UniqueName: \"kubernetes.io/projected/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-kube-api-access-czrnm\") pod \"community-operators-xgq9g\" (UID: \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\") " pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.812277 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6tb5\" (UniqueName: \"kubernetes.io/projected/5a8eea64-29d5-46bc-89db-23ad451eba4f-kube-api-access-j6tb5\") pod \"certified-operators-x9xkw\" (UID: \"5a8eea64-29d5-46bc-89db-23ad451eba4f\") " pod="openshift-marketplace/certified-operators-x9xkw"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.812294 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-utilities\") pod \"community-operators-xgq9g\" (UID: \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\") " pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.812326 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-catalog-content\") pod \"community-operators-xgq9g\" (UID: \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\") " pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.812347 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a8eea64-29d5-46bc-89db-23ad451eba4f-utilities\") pod \"certified-operators-x9xkw\" (UID: \"5a8eea64-29d5-46bc-89db-23ad451eba4f\") " pod="openshift-marketplace/certified-operators-x9xkw"
Feb 26 08:48:02 crc kubenswrapper[4925]: E0226 08:48:02.812437 4925 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 08:48:03.312421786 +0000 UTC m=+215.451996059 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.813021 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-utilities\") pod \"community-operators-xgq9g\" (UID: \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\") " pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.813250 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-catalog-content\") pod \"community-operators-xgq9g\" (UID: \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\") " pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.813861 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x9xkw"]
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.830412 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8hj5v"]
Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.866176 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czrnm\" (UniqueName: \"kubernetes.io/projected/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-kube-api-access-czrnm\") pod \"community-operators-xgq9g\" (UID: \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\") " pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:48:02 crc
kubenswrapper[4925]: I0226 08:48:02.868545 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb"] Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.868802 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" podUID="60dcb438-eb57-425d-a9a0-c1dfe9a884e2" containerName="route-controller-manager" containerID="cri-o://8e2e48cacc19942d2679edccd58011931af997a132f38d7c7f6c53c99d077de2" gracePeriod=30 Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.884959 4925 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.885023 4925 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.901056 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8136d8e6-24c6-404b-9c09-13ba16c0fd86","Type":"ContainerStarted","Data":"6220b5cb7e2a59a4ad7ef20736f031beed8be9da188a9b6fc9bab5a8579c5d9a"} Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.915934 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534928-pxjft" event={"ID":"b2a81774-e811-40f9-a623-c50a6bcd5d7b","Type":"ContainerStarted","Data":"9537c95d197a961ce5ec196a6f7fbaf435445ff318995a57c283825098fba88f"} Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.916045 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a8eea64-29d5-46bc-89db-23ad451eba4f-utilities\") pod 
\"certified-operators-x9xkw\" (UID: \"5a8eea64-29d5-46bc-89db-23ad451eba4f\") " pod="openshift-marketplace/certified-operators-x9xkw" Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.916090 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.916138 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a8eea64-29d5-46bc-89db-23ad451eba4f-catalog-content\") pod \"certified-operators-x9xkw\" (UID: \"5a8eea64-29d5-46bc-89db-23ad451eba4f\") " pod="openshift-marketplace/certified-operators-x9xkw" Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.916162 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6tb5\" (UniqueName: \"kubernetes.io/projected/5a8eea64-29d5-46bc-89db-23ad451eba4f-kube-api-access-j6tb5\") pod \"certified-operators-x9xkw\" (UID: \"5a8eea64-29d5-46bc-89db-23ad451eba4f\") " pod="openshift-marketplace/certified-operators-x9xkw" Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.916779 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a8eea64-29d5-46bc-89db-23ad451eba4f-utilities\") pod \"certified-operators-x9xkw\" (UID: \"5a8eea64-29d5-46bc-89db-23ad451eba4f\") " pod="openshift-marketplace/certified-operators-x9xkw" Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.917023 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5a8eea64-29d5-46bc-89db-23ad451eba4f-catalog-content\") pod \"certified-operators-x9xkw\" (UID: \"5a8eea64-29d5-46bc-89db-23ad451eba4f\") " pod="openshift-marketplace/certified-operators-x9xkw" Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.922981 4925 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.923046 4925 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.926704 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" podUID="ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b" containerName="controller-manager" containerID="cri-o://85e37e47369d4075feae2baf947a8d974c887948f10f1a1960d056ff0e3e55b3" gracePeriod=30 Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.926996 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" event={"ID":"cd55dbfe-4f78-44d1-85a2-7871197c0874","Type":"ContainerStarted","Data":"4d2fc5e0fda40921ed1b4f38b0eb881bd3df4fc25710dc58b4d15f6e85e8461a"} Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.929273 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:02 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:02 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:48:02 crc kubenswrapper[4925]: healthz check failed Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.929304 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.964891 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgq9g" Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.967510 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6tb5\" (UniqueName: \"kubernetes.io/projected/5a8eea64-29d5-46bc-89db-23ad451eba4f-kube-api-access-j6tb5\") pod \"certified-operators-x9xkw\" (UID: \"5a8eea64-29d5-46bc-89db-23ad451eba4f\") " pod="openshift-marketplace/certified-operators-x9xkw" Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.986700 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gdzrh"] Feb 26 08:48:02 crc kubenswrapper[4925]: I0226 08:48:02.987946 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.017245 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78d95f85-01b1-4a6b-8c48-b477f840035d-utilities\") pod \"community-operators-gdzrh\" (UID: \"78d95f85-01b1-4a6b-8c48-b477f840035d\") " pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.017393 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78d95f85-01b1-4a6b-8c48-b477f840035d-catalog-content\") pod \"community-operators-gdzrh\" (UID: \"78d95f85-01b1-4a6b-8c48-b477f840035d\") " pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.017736 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj2zh\" (UniqueName: \"kubernetes.io/projected/78d95f85-01b1-4a6b-8c48-b477f840035d-kube-api-access-cj2zh\") pod \"community-operators-gdzrh\" (UID: \"78d95f85-01b1-4a6b-8c48-b477f840035d\") " pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.034387 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gdzrh"] Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.118977 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj2zh\" (UniqueName: \"kubernetes.io/projected/78d95f85-01b1-4a6b-8c48-b477f840035d-kube-api-access-cj2zh\") pod \"community-operators-gdzrh\" (UID: \"78d95f85-01b1-4a6b-8c48-b477f840035d\") " pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.119413 4925 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78d95f85-01b1-4a6b-8c48-b477f840035d-utilities\") pod \"community-operators-gdzrh\" (UID: \"78d95f85-01b1-4a6b-8c48-b477f840035d\") " pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.119449 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78d95f85-01b1-4a6b-8c48-b477f840035d-catalog-content\") pod \"community-operators-gdzrh\" (UID: \"78d95f85-01b1-4a6b-8c48-b477f840035d\") " pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.119783 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78d95f85-01b1-4a6b-8c48-b477f840035d-catalog-content\") pod \"community-operators-gdzrh\" (UID: \"78d95f85-01b1-4a6b-8c48-b477f840035d\") " pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.120219 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78d95f85-01b1-4a6b-8c48-b477f840035d-utilities\") pod \"community-operators-gdzrh\" (UID: \"78d95f85-01b1-4a6b-8c48-b477f840035d\") " pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.148491 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x9xkw" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.168623 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj2zh\" (UniqueName: \"kubernetes.io/projected/78d95f85-01b1-4a6b-8c48-b477f840035d-kube-api-access-cj2zh\") pod \"community-operators-gdzrh\" (UID: \"78d95f85-01b1-4a6b-8c48-b477f840035d\") " pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.181012 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-rrzqs\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.224308 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.226109 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c8g65"] Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.227138 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.229919 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c8g65"] Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.305961 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.318935 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.325637 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n72qr\" (UniqueName: \"kubernetes.io/projected/cf852adc-ded1-43ba-9ed4-cf31d6eda903-kube-api-access-n72qr\") pod \"certified-operators-c8g65\" (UID: \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\") " pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.325714 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf852adc-ded1-43ba-9ed4-cf31d6eda903-catalog-content\") pod \"certified-operators-c8g65\" (UID: \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\") " pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.325759 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cf852adc-ded1-43ba-9ed4-cf31d6eda903-utilities\") pod \"certified-operators-c8g65\" (UID: \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\") " pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.376448 4925 ???:1] "http: TLS handshake error from 192.168.126.11:41768: no serving certificate available for the kubelet" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.418472 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.427273 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf852adc-ded1-43ba-9ed4-cf31d6eda903-catalog-content\") pod \"certified-operators-c8g65\" (UID: \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\") " pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.427341 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf852adc-ded1-43ba-9ed4-cf31d6eda903-utilities\") pod \"certified-operators-c8g65\" (UID: \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\") " pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.427434 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n72qr\" (UniqueName: \"kubernetes.io/projected/cf852adc-ded1-43ba-9ed4-cf31d6eda903-kube-api-access-n72qr\") pod \"certified-operators-c8g65\" (UID: \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\") " pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.432056 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/cf852adc-ded1-43ba-9ed4-cf31d6eda903-catalog-content\") pod \"certified-operators-c8g65\" (UID: \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\") " pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.432080 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf852adc-ded1-43ba-9ed4-cf31d6eda903-utilities\") pod \"certified-operators-c8g65\" (UID: \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\") " pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.460395 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n72qr\" (UniqueName: \"kubernetes.io/projected/cf852adc-ded1-43ba-9ed4-cf31d6eda903-kube-api-access-n72qr\") pod \"certified-operators-c8g65\" (UID: \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\") " pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.495947 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.528435 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd490e7f-4d64-4fc3-9c8d-17236968eafd-config-volume\") pod \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\" (UID: \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\") " Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.528580 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6n9c\" (UniqueName: \"kubernetes.io/projected/dd490e7f-4d64-4fc3-9c8d-17236968eafd-kube-api-access-s6n9c\") pod \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\" (UID: \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\") " Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.528616 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd490e7f-4d64-4fc3-9c8d-17236968eafd-secret-volume\") pod \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\" (UID: \"dd490e7f-4d64-4fc3-9c8d-17236968eafd\") " Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.529757 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd490e7f-4d64-4fc3-9c8d-17236968eafd-config-volume" (OuterVolumeSpecName: "config-volume") pod "dd490e7f-4d64-4fc3-9c8d-17236968eafd" (UID: "dd490e7f-4d64-4fc3-9c8d-17236968eafd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.536394 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd490e7f-4d64-4fc3-9c8d-17236968eafd-kube-api-access-s6n9c" (OuterVolumeSpecName: "kube-api-access-s6n9c") pod "dd490e7f-4d64-4fc3-9c8d-17236968eafd" (UID: "dd490e7f-4d64-4fc3-9c8d-17236968eafd"). 
InnerVolumeSpecName "kube-api-access-s6n9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.536624 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd490e7f-4d64-4fc3-9c8d-17236968eafd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dd490e7f-4d64-4fc3-9c8d-17236968eafd" (UID: "dd490e7f-4d64-4fc3-9c8d-17236968eafd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.606501 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.642281 4925 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd490e7f-4d64-4fc3-9c8d-17236968eafd-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.642394 4925 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd490e7f-4d64-4fc3-9c8d-17236968eafd-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.642405 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6n9c\" (UniqueName: \"kubernetes.io/projected/dd490e7f-4d64-4fc3-9c8d-17236968eafd-kube-api-access-s6n9c\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.643218 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.709759 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgq9g"] Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.745017 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-serving-cert\") pod \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.745080 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwdzt\" (UniqueName: \"kubernetes.io/projected/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-kube-api-access-mwdzt\") pod \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.745123 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-config\") pod \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.745160 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-client-ca\") pod \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\" (UID: \"60dcb438-eb57-425d-a9a0-c1dfe9a884e2\") " Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.746003 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-client-ca" (OuterVolumeSpecName: "client-ca") pod "60dcb438-eb57-425d-a9a0-c1dfe9a884e2" (UID: 
"60dcb438-eb57-425d-a9a0-c1dfe9a884e2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.747782 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-config" (OuterVolumeSpecName: "config") pod "60dcb438-eb57-425d-a9a0-c1dfe9a884e2" (UID: "60dcb438-eb57-425d-a9a0-c1dfe9a884e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.753204 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-kube-api-access-mwdzt" (OuterVolumeSpecName: "kube-api-access-mwdzt") pod "60dcb438-eb57-425d-a9a0-c1dfe9a884e2" (UID: "60dcb438-eb57-425d-a9a0-c1dfe9a884e2"). InnerVolumeSpecName "kube-api-access-mwdzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.755442 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "60dcb438-eb57-425d-a9a0-c1dfe9a884e2" (UID: "60dcb438-eb57-425d-a9a0-c1dfe9a884e2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.774052 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.845964 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-config\") pod \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.846445 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-serving-cert\") pod \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.847375 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-client-ca\") pod \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.847455 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57w97\" (UniqueName: \"kubernetes.io/projected/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-kube-api-access-57w97\") pod \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.847480 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-proxy-ca-bundles\") pod \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\" (UID: \"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b\") " Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.847869 4925 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.847931 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.847943 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwdzt\" (UniqueName: \"kubernetes.io/projected/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-kube-api-access-mwdzt\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.847953 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60dcb438-eb57-425d-a9a0-c1dfe9a884e2-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.847966 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-client-ca" (OuterVolumeSpecName: "client-ca") pod "ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b" (UID: "ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.848479 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-config" (OuterVolumeSpecName: "config") pod "ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b" (UID: "ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.848726 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b" (UID: "ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.860100 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-kube-api-access-57w97" (OuterVolumeSpecName: "kube-api-access-57w97") pod "ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b" (UID: "ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b"). InnerVolumeSpecName "kube-api-access-57w97". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.868466 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b" (UID: "ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.915064 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rrzqs"] Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.926940 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:03 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:03 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:48:03 crc kubenswrapper[4925]: healthz check failed Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.927328 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.945097 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx" event={"ID":"dd490e7f-4d64-4fc3-9c8d-17236968eafd","Type":"ContainerDied","Data":"93bbcde44ba175d0d29d683fe4b9f55d4580b90216daebb7b5d608d5839ff755"} Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.945140 4925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93bbcde44ba175d0d29d683fe4b9f55d4580b90216daebb7b5d608d5839ff755" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.945145 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29534925-7hrlx" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.954509 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.954558 4925 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.954571 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57w97\" (UniqueName: \"kubernetes.io/projected/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-kube-api-access-57w97\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.954581 4925 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.954592 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.955358 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" event={"ID":"cd55dbfe-4f78-44d1-85a2-7871197c0874","Type":"ContainerStarted","Data":"634df8bf8ca2ca54a90260fe4853b58ca679f5e9b4d84254853f0e2c7a256c91"} Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.955395 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" 
event={"ID":"cd55dbfe-4f78-44d1-85a2-7871197c0874","Type":"ContainerStarted","Data":"4a2409a61ec89b1ccea8e356063146029a1631a3e2b117ba7445933d2d651f34"} Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.962298 4925 generic.go:334] "Generic (PLEG): container finished" podID="ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b" containerID="85e37e47369d4075feae2baf947a8d974c887948f10f1a1960d056ff0e3e55b3" exitCode=0 Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.962406 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" event={"ID":"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b","Type":"ContainerDied","Data":"85e37e47369d4075feae2baf947a8d974c887948f10f1a1960d056ff0e3e55b3"} Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.962441 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" event={"ID":"ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b","Type":"ContainerDied","Data":"d6da8dace305dfec133cb949118bc77600f463834ef16f2466341f777ddd45a0"} Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.962462 4925 scope.go:117] "RemoveContainer" containerID="85e37e47369d4075feae2baf947a8d974c887948f10f1a1960d056ff0e3e55b3" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.962601 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8hj5v" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.983880 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5nm8c" podStartSLOduration=11.983857267 podStartE2EDuration="11.983857267s" podCreationTimestamp="2026-02-26 08:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:03.975964065 +0000 UTC m=+216.115538358" watchObservedRunningTime="2026-02-26 08:48:03.983857267 +0000 UTC m=+216.123431560" Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.989573 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x9xkw"] Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.990842 4925 generic.go:334] "Generic (PLEG): container finished" podID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" containerID="9f17818c45f5e5096157d6abdd941bb28b9a700fa55783eb4b58174eea28deb9" exitCode=0 Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.990935 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgq9g" event={"ID":"b8aa32c8-5b82-46ac-a7e2-7d6399474aff","Type":"ContainerDied","Data":"9f17818c45f5e5096157d6abdd941bb28b9a700fa55783eb4b58174eea28deb9"} Feb 26 08:48:03 crc kubenswrapper[4925]: I0226 08:48:03.990961 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgq9g" event={"ID":"b8aa32c8-5b82-46ac-a7e2-7d6399474aff","Type":"ContainerStarted","Data":"033e63046163c4208d399429317a7538136001b1e5de8831f43bbe95c1c21611"} Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:03.997496 4925 generic.go:334] "Generic (PLEG): container finished" podID="8136d8e6-24c6-404b-9c09-13ba16c0fd86" containerID="aa8841a096aaa06531f8bf0e7f68140e7d6df169cb37ac6ffaace6c4d39935e0" 
exitCode=0 Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:03.997831 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8136d8e6-24c6-404b-9c09-13ba16c0fd86","Type":"ContainerDied","Data":"aa8841a096aaa06531f8bf0e7f68140e7d6df169cb37ac6ffaace6c4d39935e0"} Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.013424 4925 generic.go:334] "Generic (PLEG): container finished" podID="60dcb438-eb57-425d-a9a0-c1dfe9a884e2" containerID="8e2e48cacc19942d2679edccd58011931af997a132f38d7c7f6c53c99d077de2" exitCode=0 Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.013464 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" event={"ID":"60dcb438-eb57-425d-a9a0-c1dfe9a884e2","Type":"ContainerDied","Data":"8e2e48cacc19942d2679edccd58011931af997a132f38d7c7f6c53c99d077de2"} Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.013491 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" event={"ID":"60dcb438-eb57-425d-a9a0-c1dfe9a884e2","Type":"ContainerDied","Data":"b7f3982b47bf25f4da2eed832850c63f13c1476afcd0e6eb976f6ce91bc7d037"} Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.013565 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.038893 4925 scope.go:117] "RemoveContainer" containerID="85e37e47369d4075feae2baf947a8d974c887948f10f1a1960d056ff0e3e55b3" Feb 26 08:48:04 crc kubenswrapper[4925]: E0226 08:48:04.043342 4925 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e37e47369d4075feae2baf947a8d974c887948f10f1a1960d056ff0e3e55b3\": container with ID starting with 85e37e47369d4075feae2baf947a8d974c887948f10f1a1960d056ff0e3e55b3 not found: ID does not exist" containerID="85e37e47369d4075feae2baf947a8d974c887948f10f1a1960d056ff0e3e55b3" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.043392 4925 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e37e47369d4075feae2baf947a8d974c887948f10f1a1960d056ff0e3e55b3"} err="failed to get container status \"85e37e47369d4075feae2baf947a8d974c887948f10f1a1960d056ff0e3e55b3\": rpc error: code = NotFound desc = could not find container \"85e37e47369d4075feae2baf947a8d974c887948f10f1a1960d056ff0e3e55b3\": container with ID starting with 85e37e47369d4075feae2baf947a8d974c887948f10f1a1960d056ff0e3e55b3 not found: ID does not exist" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.043422 4925 scope.go:117] "RemoveContainer" containerID="8e2e48cacc19942d2679edccd58011931af997a132f38d7c7f6c53c99d077de2" Feb 26 08:48:04 crc kubenswrapper[4925]: W0226 08:48:04.053918 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a8eea64_29d5_46bc_89db_23ad451eba4f.slice/crio-a67205620aa955172bda0954fb651162873aa1a19d3b7052cbf60e351f1c196c WatchSource:0}: Error finding container a67205620aa955172bda0954fb651162873aa1a19d3b7052cbf60e351f1c196c: Status 404 returned error can't find the container with id 
a67205620aa955172bda0954fb651162873aa1a19d3b7052cbf60e351f1c196c Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.053988 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8hj5v"] Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.062302 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8hj5v"] Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.069627 4925 scope.go:117] "RemoveContainer" containerID="8e2e48cacc19942d2679edccd58011931af997a132f38d7c7f6c53c99d077de2" Feb 26 08:48:04 crc kubenswrapper[4925]: E0226 08:48:04.071191 4925 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e2e48cacc19942d2679edccd58011931af997a132f38d7c7f6c53c99d077de2\": container with ID starting with 8e2e48cacc19942d2679edccd58011931af997a132f38d7c7f6c53c99d077de2 not found: ID does not exist" containerID="8e2e48cacc19942d2679edccd58011931af997a132f38d7c7f6c53c99d077de2" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.071231 4925 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e2e48cacc19942d2679edccd58011931af997a132f38d7c7f6c53c99d077de2"} err="failed to get container status \"8e2e48cacc19942d2679edccd58011931af997a132f38d7c7f6c53c99d077de2\": rpc error: code = NotFound desc = could not find container \"8e2e48cacc19942d2679edccd58011931af997a132f38d7c7f6c53c99d077de2\": container with ID starting with 8e2e48cacc19942d2679edccd58011931af997a132f38d7c7f6c53c99d077de2 not found: ID does not exist" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.072941 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gdzrh"] Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.076153 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb"] Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.078828 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-pv5bb"] Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.238235 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.239196 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-kv2xl" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.247324 4925 patch_prober.go:28] interesting pod/console-f9d7485db-kv2xl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.247377 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-kv2xl" podUID="7c3241c5-45b5-464d-866b-690bcedf1934" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.250693 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.250831 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.259681 4925 patch_prober.go:28] interesting pod/apiserver-76f77b778f-nkgtm container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[+]ping ok Feb 26 08:48:04 crc kubenswrapper[4925]: [+]log ok Feb 26 08:48:04 crc kubenswrapper[4925]: [+]etcd ok Feb 26 08:48:04 crc kubenswrapper[4925]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 26 08:48:04 crc kubenswrapper[4925]: [+]poststarthook/generic-apiserver-start-informers ok Feb 26 08:48:04 crc kubenswrapper[4925]: [+]poststarthook/max-in-flight-filter ok Feb 26 08:48:04 crc kubenswrapper[4925]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 26 08:48:04 crc kubenswrapper[4925]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 26 08:48:04 crc kubenswrapper[4925]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 26 08:48:04 crc kubenswrapper[4925]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 26 08:48:04 crc kubenswrapper[4925]: [+]poststarthook/project.openshift.io-projectcache ok Feb 26 08:48:04 crc kubenswrapper[4925]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 26 08:48:04 crc kubenswrapper[4925]: [+]poststarthook/openshift.io-startinformers ok Feb 26 08:48:04 crc kubenswrapper[4925]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 26 08:48:04 crc kubenswrapper[4925]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 26 08:48:04 crc kubenswrapper[4925]: livez check failed Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.259718 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" podUID="66d0db3d-0908-48d2-be73-e1afd790f010" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.262586 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c8g65"] Feb 26 08:48:04 crc kubenswrapper[4925]: W0226 08:48:04.270934 4925 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf852adc_ded1_43ba_9ed4_cf31d6eda903.slice/crio-dbb0929885aa353ae66af1253dd8a209b5885d1362a9fad13ad620654abe7141 WatchSource:0}: Error finding container dbb0929885aa353ae66af1253dd8a209b5885d1362a9fad13ad620654abe7141: Status 404 returned error can't find the container with id dbb0929885aa353ae66af1253dd8a209b5885d1362a9fad13ad620654abe7141 Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.274993 4925 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5l7b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.275032 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b5l7b" podUID="cd116d5f-0452-44f0-ac17-2a0e2e083b93" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.275134 4925 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5l7b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.275198 4925 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-b5l7b" podUID="cd116d5f-0452-44f0-ac17-2a0e2e083b93" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.532823 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="60dcb438-eb57-425d-a9a0-c1dfe9a884e2" path="/var/lib/kubelet/pods/60dcb438-eb57-425d-a9a0-c1dfe9a884e2/volumes" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.533498 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.534818 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b" path="/var/lib/kubelet/pods/ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b/volumes" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.768746 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mrq8g"] Feb 26 08:48:04 crc kubenswrapper[4925]: E0226 08:48:04.769148 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60dcb438-eb57-425d-a9a0-c1dfe9a884e2" containerName="route-controller-manager" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.769170 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="60dcb438-eb57-425d-a9a0-c1dfe9a884e2" containerName="route-controller-manager" Feb 26 08:48:04 crc kubenswrapper[4925]: E0226 08:48:04.769205 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd490e7f-4d64-4fc3-9c8d-17236968eafd" containerName="collect-profiles" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.769217 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd490e7f-4d64-4fc3-9c8d-17236968eafd" containerName="collect-profiles" Feb 26 08:48:04 crc kubenswrapper[4925]: E0226 08:48:04.769237 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b" containerName="controller-manager" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.769244 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b" 
containerName="controller-manager" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.769396 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="60dcb438-eb57-425d-a9a0-c1dfe9a884e2" containerName="route-controller-manager" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.769410 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffe17e7a-f03b-4b6e-93a6-be7e5bf4d31b" containerName="controller-manager" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.769429 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd490e7f-4d64-4fc3-9c8d-17236968eafd" containerName="collect-profiles" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.771082 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.771514 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrq8g"] Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.775813 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.816225 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.816272 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.825121 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.843565 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp"] Feb 
26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.844236 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.847157 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.847307 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.847421 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.849220 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.849603 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.850141 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.853236 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp"] Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.854489 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.859622 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.859681 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.859628 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.859802 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.860500 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.860617 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.866821 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.869612 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdjkf\" (UniqueName: \"kubernetes.io/projected/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-kube-api-access-wdjkf\") pod \"route-controller-manager-6f574cc6bf-52qpp\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") " pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.869675 4925 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-client-ca\") pod \"route-controller-manager-6f574cc6bf-52qpp\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") " pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.870024 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nff7k\" (UniqueName: \"kubernetes.io/projected/ce17c01b-f0a9-4f82-9512-028c8f673f85-kube-api-access-nff7k\") pod \"redhat-marketplace-mrq8g\" (UID: \"ce17c01b-f0a9-4f82-9512-028c8f673f85\") " pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.870086 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-serving-cert\") pod \"route-controller-manager-6f574cc6bf-52qpp\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") " pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.870114 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-config\") pod \"route-controller-manager-6f574cc6bf-52qpp\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") " pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.870164 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce17c01b-f0a9-4f82-9512-028c8f673f85-utilities\") pod \"redhat-marketplace-mrq8g\" (UID: 
\"ce17c01b-f0a9-4f82-9512-028c8f673f85\") " pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.870231 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce17c01b-f0a9-4f82-9512-028c8f673f85-catalog-content\") pod \"redhat-marketplace-mrq8g\" (UID: \"ce17c01b-f0a9-4f82-9512-028c8f673f85\") " pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.873636 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp"] Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.896536 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp"] Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.921571 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.925855 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:04 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:04 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:48:04 crc kubenswrapper[4925]: healthz check failed Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.925937 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.971347 4925 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9dxs\" (UniqueName: \"kubernetes.io/projected/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-kube-api-access-c9dxs\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.971400 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-serving-cert\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.971476 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-client-ca\") pod \"route-controller-manager-6f574cc6bf-52qpp\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") " pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.972773 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-client-ca\") pod \"route-controller-manager-6f574cc6bf-52qpp\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") " pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.972849 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nff7k\" (UniqueName: \"kubernetes.io/projected/ce17c01b-f0a9-4f82-9512-028c8f673f85-kube-api-access-nff7k\") pod \"redhat-marketplace-mrq8g\" (UID: 
\"ce17c01b-f0a9-4f82-9512-028c8f673f85\") " pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.972923 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-serving-cert\") pod \"route-controller-manager-6f574cc6bf-52qpp\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") " pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.973850 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-client-ca\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.973887 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-config\") pod \"route-controller-manager-6f574cc6bf-52qpp\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") " pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.973947 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce17c01b-f0a9-4f82-9512-028c8f673f85-utilities\") pod \"redhat-marketplace-mrq8g\" (UID: \"ce17c01b-f0a9-4f82-9512-028c8f673f85\") " pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.974469 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ce17c01b-f0a9-4f82-9512-028c8f673f85-catalog-content\") pod \"redhat-marketplace-mrq8g\" (UID: \"ce17c01b-f0a9-4f82-9512-028c8f673f85\") " pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.974625 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-proxy-ca-bundles\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.974685 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-config\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.974790 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdjkf\" (UniqueName: \"kubernetes.io/projected/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-kube-api-access-wdjkf\") pod \"route-controller-manager-6f574cc6bf-52qpp\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") " pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.975465 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-config\") pod \"route-controller-manager-6f574cc6bf-52qpp\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") " pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:04 crc kubenswrapper[4925]: 
I0226 08:48:04.975600 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce17c01b-f0a9-4f82-9512-028c8f673f85-catalog-content\") pod \"redhat-marketplace-mrq8g\" (UID: \"ce17c01b-f0a9-4f82-9512-028c8f673f85\") " pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.974350 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce17c01b-f0a9-4f82-9512-028c8f673f85-utilities\") pod \"redhat-marketplace-mrq8g\" (UID: \"ce17c01b-f0a9-4f82-9512-028c8f673f85\") " pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.981607 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-serving-cert\") pod \"route-controller-manager-6f574cc6bf-52qpp\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") " pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.992320 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nff7k\" (UniqueName: \"kubernetes.io/projected/ce17c01b-f0a9-4f82-9512-028c8f673f85-kube-api-access-nff7k\") pod \"redhat-marketplace-mrq8g\" (UID: \"ce17c01b-f0a9-4f82-9512-028c8f673f85\") " pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:48:04 crc kubenswrapper[4925]: I0226 08:48:04.995840 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdjkf\" (UniqueName: \"kubernetes.io/projected/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-kube-api-access-wdjkf\") pod \"route-controller-manager-6f574cc6bf-52qpp\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") " pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 
08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.027202 4925 generic.go:334] "Generic (PLEG): container finished" podID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" containerID="59a9b95829a6337e61ff88f289834a710d7a9f8d8df3e220e0b3574b15cdb996" exitCode=0 Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.027277 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8g65" event={"ID":"cf852adc-ded1-43ba-9ed4-cf31d6eda903","Type":"ContainerDied","Data":"59a9b95829a6337e61ff88f289834a710d7a9f8d8df3e220e0b3574b15cdb996"} Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.027304 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8g65" event={"ID":"cf852adc-ded1-43ba-9ed4-cf31d6eda903","Type":"ContainerStarted","Data":"dbb0929885aa353ae66af1253dd8a209b5885d1362a9fad13ad620654abe7141"} Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.044338 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" event={"ID":"dd43ed89-2634-4c7a-a030-a468d0b1c481","Type":"ContainerStarted","Data":"7234b8d5dc8243821856cce1a3a904e872e85fa81f26e74d8777756ec254db93"} Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.044385 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" event={"ID":"dd43ed89-2634-4c7a-a030-a468d0b1c481","Type":"ContainerStarted","Data":"f6c84832e5cc37a02c8bd503ac7c59606523f51f8dee9a8a9069b52698edcb21"} Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.044428 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.056039 4925 generic.go:334] "Generic (PLEG): container finished" podID="78d95f85-01b1-4a6b-8c48-b477f840035d" containerID="e717369ee2e3b75e644cb9877ce06bc29cdde42e69226b6b196e632d71fdc7d5" 
exitCode=0 Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.056175 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdzrh" event={"ID":"78d95f85-01b1-4a6b-8c48-b477f840035d","Type":"ContainerDied","Data":"e717369ee2e3b75e644cb9877ce06bc29cdde42e69226b6b196e632d71fdc7d5"} Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.056212 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdzrh" event={"ID":"78d95f85-01b1-4a6b-8c48-b477f840035d","Type":"ContainerStarted","Data":"498efbb87d26ef94b618f671b5a5c7cafc2ebf24ea483183bb730ee7a3980603"} Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.062880 4925 generic.go:334] "Generic (PLEG): container finished" podID="5a8eea64-29d5-46bc-89db-23ad451eba4f" containerID="58fd9e548586058b4734a234cde7c37a81649b2d5fd3089d74399ca701060647" exitCode=0 Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.063506 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9xkw" event={"ID":"5a8eea64-29d5-46bc-89db-23ad451eba4f","Type":"ContainerDied","Data":"58fd9e548586058b4734a234cde7c37a81649b2d5fd3089d74399ca701060647"} Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.063634 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9xkw" event={"ID":"5a8eea64-29d5-46bc-89db-23ad451eba4f","Type":"ContainerStarted","Data":"a67205620aa955172bda0954fb651162873aa1a19d3b7052cbf60e351f1c196c"} Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.070466 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" podStartSLOduration=169.070440947 podStartE2EDuration="2m49.070440947s" podCreationTimestamp="2026-02-26 08:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-26 08:48:05.067073015 +0000 UTC m=+217.206647308" watchObservedRunningTime="2026-02-26 08:48:05.070440947 +0000 UTC m=+217.210015230" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.070789 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lptdv" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.075777 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-client-ca\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.075869 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-proxy-ca-bundles\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.075892 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-config\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.075961 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9dxs\" (UniqueName: \"kubernetes.io/projected/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-kube-api-access-c9dxs\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " 
pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.075980 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-serving-cert\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.077170 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-client-ca\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.080059 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-config\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.080316 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-proxy-ca-bundles\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.100366 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.103504 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-serving-cert\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.105438 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9dxs\" (UniqueName: \"kubernetes.io/projected/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-kube-api-access-c9dxs\") pod \"controller-manager-5dfb7878c9-ph7pp\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") " pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.176014 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7pkxc"] Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.177261 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.191639 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.211643 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pkxc"] Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.218462 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.278200 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2h9v\" (UniqueName: \"kubernetes.io/projected/e99e124b-4917-49df-8c28-04373cfed9d3-kube-api-access-n2h9v\") pod \"redhat-marketplace-7pkxc\" (UID: \"e99e124b-4917-49df-8c28-04373cfed9d3\") " pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.278601 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e99e124b-4917-49df-8c28-04373cfed9d3-utilities\") pod \"redhat-marketplace-7pkxc\" (UID: \"e99e124b-4917-49df-8c28-04373cfed9d3\") " pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.278648 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e99e124b-4917-49df-8c28-04373cfed9d3-catalog-content\") pod \"redhat-marketplace-7pkxc\" (UID: \"e99e124b-4917-49df-8c28-04373cfed9d3\") " pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.382106 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e99e124b-4917-49df-8c28-04373cfed9d3-catalog-content\") pod \"redhat-marketplace-7pkxc\" (UID: \"e99e124b-4917-49df-8c28-04373cfed9d3\") " pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.382202 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2h9v\" (UniqueName: 
\"kubernetes.io/projected/e99e124b-4917-49df-8c28-04373cfed9d3-kube-api-access-n2h9v\") pod \"redhat-marketplace-7pkxc\" (UID: \"e99e124b-4917-49df-8c28-04373cfed9d3\") " pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.382225 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e99e124b-4917-49df-8c28-04373cfed9d3-utilities\") pod \"redhat-marketplace-7pkxc\" (UID: \"e99e124b-4917-49df-8c28-04373cfed9d3\") " pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.382950 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e99e124b-4917-49df-8c28-04373cfed9d3-utilities\") pod \"redhat-marketplace-7pkxc\" (UID: \"e99e124b-4917-49df-8c28-04373cfed9d3\") " pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.385721 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e99e124b-4917-49df-8c28-04373cfed9d3-catalog-content\") pod \"redhat-marketplace-7pkxc\" (UID: \"e99e124b-4917-49df-8c28-04373cfed9d3\") " pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.405743 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2h9v\" (UniqueName: \"kubernetes.io/projected/e99e124b-4917-49df-8c28-04373cfed9d3-kube-api-access-n2h9v\") pod \"redhat-marketplace-7pkxc\" (UID: \"e99e124b-4917-49df-8c28-04373cfed9d3\") " pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.445374 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrq8g"] Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 
08:48:05.459768 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.483323 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8136d8e6-24c6-404b-9c09-13ba16c0fd86-kubelet-dir\") pod \"8136d8e6-24c6-404b-9c09-13ba16c0fd86\" (UID: \"8136d8e6-24c6-404b-9c09-13ba16c0fd86\") " Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.483389 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8136d8e6-24c6-404b-9c09-13ba16c0fd86-kube-api-access\") pod \"8136d8e6-24c6-404b-9c09-13ba16c0fd86\" (UID: \"8136d8e6-24c6-404b-9c09-13ba16c0fd86\") " Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.484510 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8136d8e6-24c6-404b-9c09-13ba16c0fd86-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8136d8e6-24c6-404b-9c09-13ba16c0fd86" (UID: "8136d8e6-24c6-404b-9c09-13ba16c0fd86"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.494708 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8136d8e6-24c6-404b-9c09-13ba16c0fd86-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8136d8e6-24c6-404b-9c09-13ba16c0fd86" (UID: "8136d8e6-24c6-404b-9c09-13ba16c0fd86"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.532466 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.547872 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp"] Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.584790 4925 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8136d8e6-24c6-404b-9c09-13ba16c0fd86-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.584825 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8136d8e6-24c6-404b-9c09-13ba16c0fd86-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.757080 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zs499"] Feb 26 08:48:05 crc kubenswrapper[4925]: E0226 08:48:05.757519 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8136d8e6-24c6-404b-9c09-13ba16c0fd86" containerName="pruner" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.757543 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="8136d8e6-24c6-404b-9c09-13ba16c0fd86" containerName="pruner" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.757628 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="8136d8e6-24c6-404b-9c09-13ba16c0fd86" containerName="pruner" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.758377 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.761339 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.769642 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zs499"] Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.794030 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvprv\" (UniqueName: \"kubernetes.io/projected/75ccca07-d75b-4817-b203-931dc51638f4-kube-api-access-tvprv\") pod \"redhat-operators-zs499\" (UID: \"75ccca07-d75b-4817-b203-931dc51638f4\") " pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.794121 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75ccca07-d75b-4817-b203-931dc51638f4-utilities\") pod \"redhat-operators-zs499\" (UID: \"75ccca07-d75b-4817-b203-931dc51638f4\") " pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.794169 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75ccca07-d75b-4817-b203-931dc51638f4-catalog-content\") pod \"redhat-operators-zs499\" (UID: \"75ccca07-d75b-4817-b203-931dc51638f4\") " pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.836595 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pkxc"] Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.847246 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp"] Feb 26 08:48:05 crc kubenswrapper[4925]: W0226 08:48:05.849898 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode99e124b_4917_49df_8c28_04373cfed9d3.slice/crio-a4d60917cc86ec24cba0908eaaa8c99c33bd776600af08cec7c88cf007a268cd WatchSource:0}: Error finding container a4d60917cc86ec24cba0908eaaa8c99c33bd776600af08cec7c88cf007a268cd: Status 404 returned error can't find the container with id a4d60917cc86ec24cba0908eaaa8c99c33bd776600af08cec7c88cf007a268cd Feb 26 08:48:05 crc kubenswrapper[4925]: W0226 08:48:05.856989 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0744c2bc_ddba_486e_a2b7_9d75767b0bd1.slice/crio-94dab44fef57dd2128680ea492d5e40070de21e7d33936f609193ac9f45ac228 WatchSource:0}: Error finding container 94dab44fef57dd2128680ea492d5e40070de21e7d33936f609193ac9f45ac228: Status 404 returned error can't find the container with id 94dab44fef57dd2128680ea492d5e40070de21e7d33936f609193ac9f45ac228 Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.896167 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvprv\" (UniqueName: \"kubernetes.io/projected/75ccca07-d75b-4817-b203-931dc51638f4-kube-api-access-tvprv\") pod \"redhat-operators-zs499\" (UID: \"75ccca07-d75b-4817-b203-931dc51638f4\") " pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.896776 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75ccca07-d75b-4817-b203-931dc51638f4-utilities\") pod \"redhat-operators-zs499\" (UID: \"75ccca07-d75b-4817-b203-931dc51638f4\") " pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.896845 
4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75ccca07-d75b-4817-b203-931dc51638f4-catalog-content\") pod \"redhat-operators-zs499\" (UID: \"75ccca07-d75b-4817-b203-931dc51638f4\") " pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.897600 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75ccca07-d75b-4817-b203-931dc51638f4-utilities\") pod \"redhat-operators-zs499\" (UID: \"75ccca07-d75b-4817-b203-931dc51638f4\") " pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.903802 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75ccca07-d75b-4817-b203-931dc51638f4-catalog-content\") pod \"redhat-operators-zs499\" (UID: \"75ccca07-d75b-4817-b203-931dc51638f4\") " pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.929018 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvprv\" (UniqueName: \"kubernetes.io/projected/75ccca07-d75b-4817-b203-931dc51638f4-kube-api-access-tvprv\") pod \"redhat-operators-zs499\" (UID: \"75ccca07-d75b-4817-b203-931dc51638f4\") " pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.939765 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:05 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:05 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:48:05 crc kubenswrapper[4925]: healthz check failed Feb 
26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.939826 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.975884 4925 ???:1] "http: TLS handshake error from 192.168.126.11:41782: no serving certificate available for the kubelet" Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.982218 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cnjt4"] Feb 26 08:48:05 crc kubenswrapper[4925]: I0226 08:48:05.983588 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:05.998300 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cnjt4"] Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:05.998310 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9drdq\" (UniqueName: \"kubernetes.io/projected/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-kube-api-access-9drdq\") pod \"redhat-operators-cnjt4\" (UID: \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\") " pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:05.998438 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-catalog-content\") pod \"redhat-operators-cnjt4\" (UID: \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\") " pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:05.998500 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-utilities\") pod \"redhat-operators-cnjt4\" (UID: \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\") " pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.071882 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" event={"ID":"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c","Type":"ContainerStarted","Data":"fa3e04c36b043e4e73b0eb4e82809b50edaf8a5f032f85df8a60aae6c89653a4"} Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.072396 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" event={"ID":"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c","Type":"ContainerStarted","Data":"e1dc7de75641193db05f422a5de087d81591725dfac6f1abbab447569af9dfc4"} Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.072705 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.073743 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8136d8e6-24c6-404b-9c09-13ba16c0fd86","Type":"ContainerDied","Data":"6220b5cb7e2a59a4ad7ef20736f031beed8be9da188a9b6fc9bab5a8579c5d9a"} Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.073777 4925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6220b5cb7e2a59a4ad7ef20736f031beed8be9da188a9b6fc9bab5a8579c5d9a" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.073825 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.083709 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" event={"ID":"0744c2bc-ddba-486e-a2b7-9d75767b0bd1","Type":"ContainerStarted","Data":"94dab44fef57dd2128680ea492d5e40070de21e7d33936f609193ac9f45ac228"} Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.085680 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.087505 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pkxc" event={"ID":"e99e124b-4917-49df-8c28-04373cfed9d3","Type":"ContainerStarted","Data":"af9d0b4d89aa903b43b9de531519739fcc789d4e9c864117ec46bca03a24e8bd"} Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.087925 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pkxc" event={"ID":"e99e124b-4917-49df-8c28-04373cfed9d3","Type":"ContainerStarted","Data":"a4d60917cc86ec24cba0908eaaa8c99c33bd776600af08cec7c88cf007a268cd"} Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.091549 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" podStartSLOduration=3.091517227 podStartE2EDuration="3.091517227s" podCreationTimestamp="2026-02-26 08:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:06.090478302 +0000 UTC m=+218.230052605" watchObservedRunningTime="2026-02-26 08:48:06.091517227 +0000 UTC m=+218.231091500" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.096606 4925 generic.go:334] "Generic (PLEG): container finished" 
podID="ce17c01b-f0a9-4f82-9512-028c8f673f85" containerID="c8b1e07a0d3639741d9d0d68331d2a35def484baf505cd2b7b64279b21456d5d" exitCode=0 Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.096699 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrq8g" event={"ID":"ce17c01b-f0a9-4f82-9512-028c8f673f85","Type":"ContainerDied","Data":"c8b1e07a0d3639741d9d0d68331d2a35def484baf505cd2b7b64279b21456d5d"} Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.096744 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrq8g" event={"ID":"ce17c01b-f0a9-4f82-9512-028c8f673f85","Type":"ContainerStarted","Data":"39e049e87669dbaebcdddcf0c2cc23f541fd72c9f7ece7b55b5b84e1355be411"} Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.099869 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-catalog-content\") pod \"redhat-operators-cnjt4\" (UID: \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\") " pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.099948 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-utilities\") pod \"redhat-operators-cnjt4\" (UID: \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\") " pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.100022 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9drdq\" (UniqueName: \"kubernetes.io/projected/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-kube-api-access-9drdq\") pod \"redhat-operators-cnjt4\" (UID: \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\") " pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 
08:48:06.100364 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-utilities\") pod \"redhat-operators-cnjt4\" (UID: \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\") " pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.100702 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-catalog-content\") pod \"redhat-operators-cnjt4\" (UID: \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\") " pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.142785 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9drdq\" (UniqueName: \"kubernetes.io/projected/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-kube-api-access-9drdq\") pod \"redhat-operators-cnjt4\" (UID: \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\") " pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.314128 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.332814 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.746005 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.747179 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.780144 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.780427 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.791661 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.795690 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cnjt4"] Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.889268 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zs499"] Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.930159 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:06 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:06 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:48:06 crc kubenswrapper[4925]: healthz check failed Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.930227 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.936386 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/a74fe342-ce05-4a6b-91e4-1b8979d69a8d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a74fe342-ce05-4a6b-91e4-1b8979d69a8d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 08:48:06 crc kubenswrapper[4925]: I0226 08:48:06.936422 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a74fe342-ce05-4a6b-91e4-1b8979d69a8d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a74fe342-ce05-4a6b-91e4-1b8979d69a8d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 08:48:06 crc kubenswrapper[4925]: W0226 08:48:06.937421 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75ccca07_d75b_4817_b203_931dc51638f4.slice/crio-1e5a2c8008f7435ae1f2deb5b407d50be54c6e6b45d48c61c4dfeac8f972e627 WatchSource:0}: Error finding container 1e5a2c8008f7435ae1f2deb5b407d50be54c6e6b45d48c61c4dfeac8f972e627: Status 404 returned error can't find the container with id 1e5a2c8008f7435ae1f2deb5b407d50be54c6e6b45d48c61c4dfeac8f972e627 Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.037481 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a74fe342-ce05-4a6b-91e4-1b8979d69a8d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a74fe342-ce05-4a6b-91e4-1b8979d69a8d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.037518 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a74fe342-ce05-4a6b-91e4-1b8979d69a8d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a74fe342-ce05-4a6b-91e4-1b8979d69a8d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.038246 
4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a74fe342-ce05-4a6b-91e4-1b8979d69a8d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a74fe342-ce05-4a6b-91e4-1b8979d69a8d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.079391 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a74fe342-ce05-4a6b-91e4-1b8979d69a8d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a74fe342-ce05-4a6b-91e4-1b8979d69a8d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.119760 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs499" event={"ID":"75ccca07-d75b-4817-b203-931dc51638f4","Type":"ContainerStarted","Data":"1e5a2c8008f7435ae1f2deb5b407d50be54c6e6b45d48c61c4dfeac8f972e627"} Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.137927 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" event={"ID":"0744c2bc-ddba-486e-a2b7-9d75767b0bd1","Type":"ContainerStarted","Data":"c20483edba6db330b6f94a97ffa8d8435bf819abf0a816ec1840dd1a6094fb9a"} Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.139624 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.140056 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.143676 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnjt4" event={"ID":"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7","Type":"ContainerStarted","Data":"0e0b9da4888bcc4bd6d757b8bb8ecf603e4959bd40b8966ded4f6918c7c7e1c9"} Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.149238 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.157560 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" podStartSLOduration=4.157545455 podStartE2EDuration="4.157545455s" podCreationTimestamp="2026-02-26 08:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:07.155560727 +0000 UTC m=+219.295135030" watchObservedRunningTime="2026-02-26 08:48:07.157545455 +0000 UTC m=+219.297119748" Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.162033 4925 generic.go:334] "Generic (PLEG): container finished" podID="e99e124b-4917-49df-8c28-04373cfed9d3" containerID="af9d0b4d89aa903b43b9de531519739fcc789d4e9c864117ec46bca03a24e8bd" exitCode=0 Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.163066 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pkxc" event={"ID":"e99e124b-4917-49df-8c28-04373cfed9d3","Type":"ContainerDied","Data":"af9d0b4d89aa903b43b9de531519739fcc789d4e9c864117ec46bca03a24e8bd"} Feb 26 08:48:07 crc 
kubenswrapper[4925]: I0226 08:48:07.608868 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.924997 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:07 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:07 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:48:07 crc kubenswrapper[4925]: healthz check failed Feb 26 08:48:07 crc kubenswrapper[4925]: I0226 08:48:07.925094 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:08 crc kubenswrapper[4925]: I0226 08:48:08.170318 4925 generic.go:334] "Generic (PLEG): container finished" podID="75ccca07-d75b-4817-b203-931dc51638f4" containerID="8d15b13e080012da2c5ea19ae09f5b1e75cf083730a9853a8839af06a8a4cc24" exitCode=0 Feb 26 08:48:08 crc kubenswrapper[4925]: I0226 08:48:08.170409 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs499" event={"ID":"75ccca07-d75b-4817-b203-931dc51638f4","Type":"ContainerDied","Data":"8d15b13e080012da2c5ea19ae09f5b1e75cf083730a9853a8839af06a8a4cc24"} Feb 26 08:48:08 crc kubenswrapper[4925]: I0226 08:48:08.176093 4925 generic.go:334] "Generic (PLEG): container finished" podID="89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" containerID="aabd394629694064e97e24c7db8f0e6fecdba14a42a632ac2bd400e9ab9ddbdf" exitCode=0 Feb 26 08:48:08 crc kubenswrapper[4925]: I0226 08:48:08.176162 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnjt4" 
event={"ID":"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7","Type":"ContainerDied","Data":"aabd394629694064e97e24c7db8f0e6fecdba14a42a632ac2bd400e9ab9ddbdf"} Feb 26 08:48:08 crc kubenswrapper[4925]: I0226 08:48:08.177599 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a74fe342-ce05-4a6b-91e4-1b8979d69a8d","Type":"ContainerStarted","Data":"fdcfc80408a5eed19259e62ca59652ad0e8d8162633386cec08010ce623261e5"} Feb 26 08:48:08 crc kubenswrapper[4925]: I0226 08:48:08.923054 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:08 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:08 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:48:08 crc kubenswrapper[4925]: healthz check failed Feb 26 08:48:08 crc kubenswrapper[4925]: I0226 08:48:08.923122 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:09 crc kubenswrapper[4925]: I0226 08:48:09.178614 4925 ???:1] "http: TLS handshake error from 192.168.126.11:41798: no serving certificate available for the kubelet" Feb 26 08:48:09 crc kubenswrapper[4925]: I0226 08:48:09.185733 4925 generic.go:334] "Generic (PLEG): container finished" podID="a74fe342-ce05-4a6b-91e4-1b8979d69a8d" containerID="ef679b7bdf3e40bbbd1e915cd7367ec6ad24c64eb6e0e4a477a78ec5484ad332" exitCode=0 Feb 26 08:48:09 crc kubenswrapper[4925]: I0226 08:48:09.185877 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"a74fe342-ce05-4a6b-91e4-1b8979d69a8d","Type":"ContainerDied","Data":"ef679b7bdf3e40bbbd1e915cd7367ec6ad24c64eb6e0e4a477a78ec5484ad332"} Feb 26 08:48:09 crc kubenswrapper[4925]: I0226 08:48:09.265202 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:48:09 crc kubenswrapper[4925]: I0226 08:48:09.270413 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-nkgtm" Feb 26 08:48:09 crc kubenswrapper[4925]: I0226 08:48:09.923744 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:09 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:09 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:48:09 crc kubenswrapper[4925]: healthz check failed Feb 26 08:48:09 crc kubenswrapper[4925]: I0226 08:48:09.924123 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:10 crc kubenswrapper[4925]: I0226 08:48:10.411518 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hjqhw" Feb 26 08:48:10 crc kubenswrapper[4925]: I0226 08:48:10.926520 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:10 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:10 crc kubenswrapper[4925]: [+]process-running ok Feb 26 
08:48:10 crc kubenswrapper[4925]: healthz check failed Feb 26 08:48:10 crc kubenswrapper[4925]: I0226 08:48:10.927587 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:11 crc kubenswrapper[4925]: I0226 08:48:11.118967 4925 ???:1] "http: TLS handshake error from 192.168.126.11:41804: no serving certificate available for the kubelet" Feb 26 08:48:11 crc kubenswrapper[4925]: I0226 08:48:11.924745 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:11 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:11 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:48:11 crc kubenswrapper[4925]: healthz check failed Feb 26 08:48:11 crc kubenswrapper[4925]: I0226 08:48:11.924828 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:12 crc kubenswrapper[4925]: I0226 08:48:12.923437 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:12 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:12 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:48:12 crc kubenswrapper[4925]: healthz check failed Feb 26 08:48:12 crc kubenswrapper[4925]: I0226 08:48:12.923644 4925 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:13 crc kubenswrapper[4925]: I0226 08:48:13.923866 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:13 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:13 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:48:13 crc kubenswrapper[4925]: healthz check failed Feb 26 08:48:13 crc kubenswrapper[4925]: I0226 08:48:13.923925 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:14 crc kubenswrapper[4925]: I0226 08:48:14.238359 4925 patch_prober.go:28] interesting pod/console-f9d7485db-kv2xl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 26 08:48:14 crc kubenswrapper[4925]: I0226 08:48:14.238429 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-kv2xl" podUID="7c3241c5-45b5-464d-866b-690bcedf1934" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 26 08:48:14 crc kubenswrapper[4925]: I0226 08:48:14.275160 4925 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5l7b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 
10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 08:48:14 crc kubenswrapper[4925]: I0226 08:48:14.275207 4925 patch_prober.go:28] interesting pod/downloads-7954f5f757-b5l7b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 08:48:14 crc kubenswrapper[4925]: I0226 08:48:14.275218 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-b5l7b" podUID="cd116d5f-0452-44f0-ac17-2a0e2e083b93" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 08:48:14 crc kubenswrapper[4925]: I0226 08:48:14.275253 4925 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-b5l7b" podUID="cd116d5f-0452-44f0-ac17-2a0e2e083b93" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 08:48:14 crc kubenswrapper[4925]: I0226 08:48:14.925494 4925 patch_prober.go:28] interesting pod/router-default-5444994796-h5g2f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 08:48:14 crc kubenswrapper[4925]: [-]has-synced failed: reason withheld Feb 26 08:48:14 crc kubenswrapper[4925]: [+]process-running ok Feb 26 08:48:14 crc kubenswrapper[4925]: healthz check failed Feb 26 08:48:14 crc kubenswrapper[4925]: I0226 08:48:14.925564 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5g2f" podUID="0fc3c69d-10be-4495-a469-8b975f26754f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 08:48:15 crc kubenswrapper[4925]: 
I0226 08:48:15.873067 4925 patch_prober.go:28] interesting pod/machine-config-daemon-s6pcl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 08:48:15 crc kubenswrapper[4925]: I0226 08:48:15.873465 4925 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" podUID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 08:48:15 crc kubenswrapper[4925]: I0226 08:48:15.927041 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:48:15 crc kubenswrapper[4925]: I0226 08:48:15.946094 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-h5g2f" Feb 26 08:48:15 crc kubenswrapper[4925]: I0226 08:48:15.951336 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pv859" Feb 26 08:48:16 crc kubenswrapper[4925]: I0226 08:48:16.946690 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 08:48:17 crc kubenswrapper[4925]: I0226 08:48:17.118937 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a74fe342-ce05-4a6b-91e4-1b8979d69a8d-kubelet-dir\") pod \"a74fe342-ce05-4a6b-91e4-1b8979d69a8d\" (UID: \"a74fe342-ce05-4a6b-91e4-1b8979d69a8d\") " Feb 26 08:48:17 crc kubenswrapper[4925]: I0226 08:48:17.119044 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a74fe342-ce05-4a6b-91e4-1b8979d69a8d-kube-api-access\") pod \"a74fe342-ce05-4a6b-91e4-1b8979d69a8d\" (UID: \"a74fe342-ce05-4a6b-91e4-1b8979d69a8d\") " Feb 26 08:48:17 crc kubenswrapper[4925]: I0226 08:48:17.119288 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a74fe342-ce05-4a6b-91e4-1b8979d69a8d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a74fe342-ce05-4a6b-91e4-1b8979d69a8d" (UID: "a74fe342-ce05-4a6b-91e4-1b8979d69a8d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:48:17 crc kubenswrapper[4925]: I0226 08:48:17.129803 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a74fe342-ce05-4a6b-91e4-1b8979d69a8d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a74fe342-ce05-4a6b-91e4-1b8979d69a8d" (UID: "a74fe342-ce05-4a6b-91e4-1b8979d69a8d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:48:17 crc kubenswrapper[4925]: I0226 08:48:17.220504 4925 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a74fe342-ce05-4a6b-91e4-1b8979d69a8d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:17 crc kubenswrapper[4925]: I0226 08:48:17.220552 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a74fe342-ce05-4a6b-91e4-1b8979d69a8d-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:17 crc kubenswrapper[4925]: I0226 08:48:17.255484 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a74fe342-ce05-4a6b-91e4-1b8979d69a8d","Type":"ContainerDied","Data":"fdcfc80408a5eed19259e62ca59652ad0e8d8162633386cec08010ce623261e5"} Feb 26 08:48:17 crc kubenswrapper[4925]: I0226 08:48:17.255560 4925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdcfc80408a5eed19259e62ca59652ad0e8d8162633386cec08010ce623261e5" Feb 26 08:48:17 crc kubenswrapper[4925]: I0226 08:48:17.255653 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 26 08:48:21 crc kubenswrapper[4925]: I0226 08:48:21.746704 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp"]
Feb 26 08:48:21 crc kubenswrapper[4925]: I0226 08:48:21.747553 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" podUID="0744c2bc-ddba-486e-a2b7-9d75767b0bd1" containerName="controller-manager" containerID="cri-o://c20483edba6db330b6f94a97ffa8d8435bf819abf0a816ec1840dd1a6094fb9a" gracePeriod=30
Feb 26 08:48:21 crc kubenswrapper[4925]: I0226 08:48:21.750672 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp"]
Feb 26 08:48:21 crc kubenswrapper[4925]: I0226 08:48:21.750994 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" podUID="0f6c8eec-e24c-43c9-a0d5-a7e9543d456c" containerName="route-controller-manager" containerID="cri-o://fa3e04c36b043e4e73b0eb4e82809b50edaf8a5f032f85df8a60aae6c89653a4" gracePeriod=30
Feb 26 08:48:23 crc kubenswrapper[4925]: I0226 08:48:23.325935 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:48:24 crc kubenswrapper[4925]: I0226 08:48:24.289061 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-b5l7b"
Feb 26 08:48:24 crc kubenswrapper[4925]: I0226 08:48:24.296582 4925 generic.go:334] "Generic (PLEG): container finished" podID="0744c2bc-ddba-486e-a2b7-9d75767b0bd1" containerID="c20483edba6db330b6f94a97ffa8d8435bf819abf0a816ec1840dd1a6094fb9a" exitCode=0
Feb 26 08:48:24 crc kubenswrapper[4925]: I0226 08:48:24.296633 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" event={"ID":"0744c2bc-ddba-486e-a2b7-9d75767b0bd1","Type":"ContainerDied","Data":"c20483edba6db330b6f94a97ffa8d8435bf819abf0a816ec1840dd1a6094fb9a"}
Feb 26 08:48:24 crc kubenswrapper[4925]: I0226 08:48:24.466907 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-kv2xl"
Feb 26 08:48:24 crc kubenswrapper[4925]: I0226 08:48:24.472762 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-kv2xl"
Feb 26 08:48:25 crc kubenswrapper[4925]: I0226 08:48:25.193158 4925 patch_prober.go:28] interesting pod/route-controller-manager-6f574cc6bf-52qpp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body=
Feb 26 08:48:25 crc kubenswrapper[4925]: I0226 08:48:25.193264 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" podUID="0f6c8eec-e24c-43c9-a0d5-a7e9543d456c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused"
Feb 26 08:48:25 crc kubenswrapper[4925]: I0226 08:48:25.220116 4925 patch_prober.go:28] interesting pod/controller-manager-5dfb7878c9-ph7pp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body=
Feb 26 08:48:25 crc kubenswrapper[4925]: I0226 08:48:25.220187 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" podUID="0744c2bc-ddba-486e-a2b7-9d75767b0bd1" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused"
Feb 26 08:48:25 crc kubenswrapper[4925]: I0226 08:48:25.310402 4925 generic.go:334] "Generic (PLEG): container finished" podID="0f6c8eec-e24c-43c9-a0d5-a7e9543d456c" containerID="fa3e04c36b043e4e73b0eb4e82809b50edaf8a5f032f85df8a60aae6c89653a4" exitCode=0
Feb 26 08:48:25 crc kubenswrapper[4925]: I0226 08:48:25.310815 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" event={"ID":"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c","Type":"ContainerDied","Data":"fa3e04c36b043e4e73b0eb4e82809b50edaf8a5f032f85df8a60aae6c89653a4"}
Feb 26 08:48:31 crc kubenswrapper[4925]: I0226 08:48:31.623270 4925 ???:1] "http: TLS handshake error from 192.168.126.11:34172: no serving certificate available for the kubelet"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.718662 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.744704 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"]
Feb 26 08:48:32 crc kubenswrapper[4925]: E0226 08:48:32.744952 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74fe342-ce05-4a6b-91e4-1b8979d69a8d" containerName="pruner"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.744967 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74fe342-ce05-4a6b-91e4-1b8979d69a8d" containerName="pruner"
Feb 26 08:48:32 crc kubenswrapper[4925]: E0226 08:48:32.744984 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0744c2bc-ddba-486e-a2b7-9d75767b0bd1" containerName="controller-manager"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.744992 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="0744c2bc-ddba-486e-a2b7-9d75767b0bd1" containerName="controller-manager"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.745168 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74fe342-ce05-4a6b-91e4-1b8979d69a8d" containerName="pruner"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.745211 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="0744c2bc-ddba-486e-a2b7-9d75767b0bd1" containerName="controller-manager"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.746038 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.752578 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"]
Feb 26 08:48:32 crc kubenswrapper[4925]: E0226 08:48:32.819318 4925 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Feb 26 08:48:32 crc kubenswrapper[4925]: E0226 08:48:32.819672 4925 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 26 08:48:32 crc kubenswrapper[4925]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Feb 26 08:48:32 crc kubenswrapper[4925]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8gg9g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29534926-k666h_openshift-infra(0e938a2d-afc6-4360-aa50-e93c71d2c8c3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Feb 26 08:48:32 crc kubenswrapper[4925]: > logger="UnhandledError"
Feb 26 08:48:32 crc kubenswrapper[4925]: E0226 08:48:32.821174 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29534926-k666h" podUID="0e938a2d-afc6-4360-aa50-e93c71d2c8c3"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.837891 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-client-ca\") pod \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") "
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.838043 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-serving-cert\") pod \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") "
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.838075 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-proxy-ca-bundles\") pod \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") "
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.838102 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-config\") pod \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") "
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.838152 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9dxs\" (UniqueName: \"kubernetes.io/projected/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-kube-api-access-c9dxs\") pod \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\" (UID: \"0744c2bc-ddba-486e-a2b7-9d75767b0bd1\") "
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.838785 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-client-ca" (OuterVolumeSpecName: "client-ca") pod "0744c2bc-ddba-486e-a2b7-9d75767b0bd1" (UID: "0744c2bc-ddba-486e-a2b7-9d75767b0bd1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.839178 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0744c2bc-ddba-486e-a2b7-9d75767b0bd1" (UID: "0744c2bc-ddba-486e-a2b7-9d75767b0bd1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.839258 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-config" (OuterVolumeSpecName: "config") pod "0744c2bc-ddba-486e-a2b7-9d75767b0bd1" (UID: "0744c2bc-ddba-486e-a2b7-9d75767b0bd1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.843723 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0744c2bc-ddba-486e-a2b7-9d75767b0bd1" (UID: "0744c2bc-ddba-486e-a2b7-9d75767b0bd1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.844003 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-kube-api-access-c9dxs" (OuterVolumeSpecName: "kube-api-access-c9dxs") pod "0744c2bc-ddba-486e-a2b7-9d75767b0bd1" (UID: "0744c2bc-ddba-486e-a2b7-9d75767b0bd1"). InnerVolumeSpecName "kube-api-access-c9dxs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.939871 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hpxl\" (UniqueName: \"kubernetes.io/projected/c30177a7-9445-4e51-94c7-b1a86984ad4b-kube-api-access-2hpxl\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.939933 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-client-ca\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.939969 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-config\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.940037 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c30177a7-9445-4e51-94c7-b1a86984ad4b-serving-cert\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.940142 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-proxy-ca-bundles\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.940195 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.940211 4925 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.940225 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-config\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.940235 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9dxs\" (UniqueName: \"kubernetes.io/projected/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-kube-api-access-c9dxs\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:32 crc kubenswrapper[4925]: I0226 08:48:32.940243 4925 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0744c2bc-ddba-486e-a2b7-9d75767b0bd1-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.041743 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-proxy-ca-bundles\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.041828 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hpxl\" (UniqueName: \"kubernetes.io/projected/c30177a7-9445-4e51-94c7-b1a86984ad4b-kube-api-access-2hpxl\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.041911 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-client-ca\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.041982 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-config\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.042008 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c30177a7-9445-4e51-94c7-b1a86984ad4b-serving-cert\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.043041 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-client-ca\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.043753 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-config\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.043805 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-proxy-ca-bundles\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.056686 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c30177a7-9445-4e51-94c7-b1a86984ad4b-serving-cert\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.059007 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hpxl\" (UniqueName: \"kubernetes.io/projected/c30177a7-9445-4e51-94c7-b1a86984ad4b-kube-api-access-2hpxl\") pod \"controller-manager-56df47f7bc-k4ckp\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") " pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.080920 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:33 crc kubenswrapper[4925]: E0226 08:48:33.102947 4925 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Feb 26 08:48:33 crc kubenswrapper[4925]: E0226 08:48:33.103070 4925 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 26 08:48:33 crc kubenswrapper[4925]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Feb 26 08:48:33 crc kubenswrapper[4925]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b2qjw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29534928-pxjft_openshift-infra(b2a81774-e811-40f9-a623-c50a6bcd5d7b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Feb 26 08:48:33 crc kubenswrapper[4925]: > logger="UnhandledError"
Feb 26 08:48:33 crc kubenswrapper[4925]: E0226 08:48:33.104253 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29534928-pxjft" podUID="b2a81774-e811-40f9-a623-c50a6bcd5d7b"
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.358896 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp" event={"ID":"0744c2bc-ddba-486e-a2b7-9d75767b0bd1","Type":"ContainerDied","Data":"94dab44fef57dd2128680ea492d5e40070de21e7d33936f609193ac9f45ac228"}
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.359061 4925 scope.go:117] "RemoveContainer" containerID="c20483edba6db330b6f94a97ffa8d8435bf819abf0a816ec1840dd1a6094fb9a"
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.359459 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp"
Feb 26 08:48:33 crc kubenswrapper[4925]: E0226 08:48:33.360980 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29534926-k666h" podUID="0e938a2d-afc6-4360-aa50-e93c71d2c8c3"
Feb 26 08:48:33 crc kubenswrapper[4925]: E0226 08:48:33.361269 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29534928-pxjft" podUID="b2a81774-e811-40f9-a623-c50a6bcd5d7b"
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.410499 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp"]
Feb 26 08:48:33 crc kubenswrapper[4925]: I0226 08:48:33.412774 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5dfb7878c9-ph7pp"]
Feb 26 08:48:34 crc kubenswrapper[4925]: I0226 08:48:34.041166 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 08:48:34 crc kubenswrapper[4925]: I0226 08:48:34.514115 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0744c2bc-ddba-486e-a2b7-9d75767b0bd1" path="/var/lib/kubelet/pods/0744c2bc-ddba-486e-a2b7-9d75767b0bd1/volumes"
Feb 26 08:48:35 crc kubenswrapper[4925]: I0226 08:48:35.343922 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-fgfsf"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.193586 4925 patch_prober.go:28] interesting pod/route-controller-manager-6f574cc6bf-52qpp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.193893 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" podUID="0f6c8eec-e24c-43c9-a0d5-a7e9543d456c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.557754 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.588720 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"]
Feb 26 08:48:36 crc kubenswrapper[4925]: E0226 08:48:36.589426 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6c8eec-e24c-43c9-a0d5-a7e9543d456c" containerName="route-controller-manager"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.589469 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6c8eec-e24c-43c9-a0d5-a7e9543d456c" containerName="route-controller-manager"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.589862 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6c8eec-e24c-43c9-a0d5-a7e9543d456c" containerName="route-controller-manager"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.590521 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.600327 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"]
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.683008 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-config\") pod \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") "
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.683132 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdjkf\" (UniqueName: \"kubernetes.io/projected/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-kube-api-access-wdjkf\") pod \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") "
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.683157 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-client-ca\") pod \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") "
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.683177 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-serving-cert\") pod \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\" (UID: \"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c\") "
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.684391 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-client-ca" (OuterVolumeSpecName: "client-ca") pod "0f6c8eec-e24c-43c9-a0d5-a7e9543d456c" (UID: "0f6c8eec-e24c-43c9-a0d5-a7e9543d456c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.684452 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-config" (OuterVolumeSpecName: "config") pod "0f6c8eec-e24c-43c9-a0d5-a7e9543d456c" (UID: "0f6c8eec-e24c-43c9-a0d5-a7e9543d456c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.690393 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-kube-api-access-wdjkf" (OuterVolumeSpecName: "kube-api-access-wdjkf") pod "0f6c8eec-e24c-43c9-a0d5-a7e9543d456c" (UID: "0f6c8eec-e24c-43c9-a0d5-a7e9543d456c"). InnerVolumeSpecName "kube-api-access-wdjkf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.690501 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0f6c8eec-e24c-43c9-a0d5-a7e9543d456c" (UID: "0f6c8eec-e24c-43c9-a0d5-a7e9543d456c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.783986 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a58b8632-0501-4eac-8432-b67dda393366-client-ca\") pod \"route-controller-manager-79686b7648-l4j26\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") " pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.784058 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dp6n\" (UniqueName: \"kubernetes.io/projected/a58b8632-0501-4eac-8432-b67dda393366-kube-api-access-5dp6n\") pod \"route-controller-manager-79686b7648-l4j26\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") " pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.784101 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58b8632-0501-4eac-8432-b67dda393366-serving-cert\") pod \"route-controller-manager-79686b7648-l4j26\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") " pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.784409 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a58b8632-0501-4eac-8432-b67dda393366-config\") pod \"route-controller-manager-79686b7648-l4j26\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") " pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.784706 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdjkf\" (UniqueName: \"kubernetes.io/projected/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-kube-api-access-wdjkf\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.784738 4925 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.784757 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.784771 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c-config\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.885502 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a58b8632-0501-4eac-8432-b67dda393366-client-ca\") pod \"route-controller-manager-79686b7648-l4j26\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") " pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.885789 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dp6n\" (UniqueName: \"kubernetes.io/projected/a58b8632-0501-4eac-8432-b67dda393366-kube-api-access-5dp6n\") pod \"route-controller-manager-79686b7648-l4j26\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") " pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.885815 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58b8632-0501-4eac-8432-b67dda393366-serving-cert\") pod \"route-controller-manager-79686b7648-l4j26\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") " pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.885877 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a58b8632-0501-4eac-8432-b67dda393366-config\") pod \"route-controller-manager-79686b7648-l4j26\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") " pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.886430 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a58b8632-0501-4eac-8432-b67dda393366-client-ca\") pod \"route-controller-manager-79686b7648-l4j26\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") " pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.887010 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a58b8632-0501-4eac-8432-b67dda393366-config\") pod \"route-controller-manager-79686b7648-l4j26\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") " pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.890222 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58b8632-0501-4eac-8432-b67dda393366-serving-cert\") pod \"route-controller-manager-79686b7648-l4j26\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") " pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.902957 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dp6n\" (UniqueName: \"kubernetes.io/projected/a58b8632-0501-4eac-8432-b67dda393366-kube-api-access-5dp6n\") pod \"route-controller-manager-79686b7648-l4j26\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") " pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:36 crc kubenswrapper[4925]: I0226 08:48:36.919228 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:37 crc kubenswrapper[4925]: I0226 08:48:37.381158 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp" event={"ID":"0f6c8eec-e24c-43c9-a0d5-a7e9543d456c","Type":"ContainerDied","Data":"e1dc7de75641193db05f422a5de087d81591725dfac6f1abbab447569af9dfc4"}
Feb 26 08:48:37 crc kubenswrapper[4925]: I0226 08:48:37.381235 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp"
Feb 26 08:48:37 crc kubenswrapper[4925]: I0226 08:48:37.410602 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp"]
Feb 26 08:48:37 crc kubenswrapper[4925]: I0226 08:48:37.413918 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f574cc6bf-52qpp"]
Feb 26 08:48:38 crc kubenswrapper[4925]: I0226 08:48:38.539594 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f6c8eec-e24c-43c9-a0d5-a7e9543d456c" path="/var/lib/kubelet/pods/0f6c8eec-e24c-43c9-a0d5-a7e9543d456c/volumes"
Feb 26 08:48:38 crc kubenswrapper[4925]: I0226 08:48:38.540441 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 26 08:48:38 crc kubenswrapper[4925]: I0226 08:48:38.541203 4925 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 08:48:38 crc kubenswrapper[4925]: I0226 08:48:38.546102 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 26 08:48:38 crc kubenswrapper[4925]: I0226 08:48:38.546391 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 26 08:48:38 crc kubenswrapper[4925]: I0226 08:48:38.546433 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 08:48:38 crc kubenswrapper[4925]: E0226 08:48:38.564024 4925 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 26 08:48:38 crc kubenswrapper[4925]: E0226 08:48:38.564431 4925 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j6tb5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-x9xkw_openshift-marketplace(5a8eea64-29d5-46bc-89db-23ad451eba4f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 08:48:38 crc kubenswrapper[4925]: E0226 08:48:38.566755 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-x9xkw" podUID="5a8eea64-29d5-46bc-89db-23ad451eba4f" Feb 26 08:48:38 crc 
kubenswrapper[4925]: I0226 08:48:38.708590 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dda7988d-56f1-4f44-b373-650834d33593-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dda7988d-56f1-4f44-b373-650834d33593\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 08:48:38 crc kubenswrapper[4925]: I0226 08:48:38.708681 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dda7988d-56f1-4f44-b373-650834d33593-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dda7988d-56f1-4f44-b373-650834d33593\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 08:48:38 crc kubenswrapper[4925]: I0226 08:48:38.810383 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dda7988d-56f1-4f44-b373-650834d33593-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dda7988d-56f1-4f44-b373-650834d33593\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 08:48:38 crc kubenswrapper[4925]: I0226 08:48:38.810470 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dda7988d-56f1-4f44-b373-650834d33593-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dda7988d-56f1-4f44-b373-650834d33593\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 08:48:38 crc kubenswrapper[4925]: I0226 08:48:38.810589 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dda7988d-56f1-4f44-b373-650834d33593-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"dda7988d-56f1-4f44-b373-650834d33593\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 08:48:38 crc kubenswrapper[4925]: I0226 
08:48:38.835861 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dda7988d-56f1-4f44-b373-650834d33593-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"dda7988d-56f1-4f44-b373-650834d33593\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 08:48:38 crc kubenswrapper[4925]: I0226 08:48:38.874680 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 08:48:41 crc kubenswrapper[4925]: I0226 08:48:41.748452 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"] Feb 26 08:48:41 crc kubenswrapper[4925]: I0226 08:48:41.834273 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"] Feb 26 08:48:42 crc kubenswrapper[4925]: I0226 08:48:42.919950 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 08:48:42 crc kubenswrapper[4925]: I0226 08:48:42.921241 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 08:48:42 crc kubenswrapper[4925]: I0226 08:48:42.924635 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 08:48:43 crc kubenswrapper[4925]: I0226 08:48:43.118844 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a11f03c1-788a-4e80-aaf8-02dd3a150814-var-lock\") pod \"installer-9-crc\" (UID: \"a11f03c1-788a-4e80-aaf8-02dd3a150814\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 08:48:43 crc kubenswrapper[4925]: I0226 08:48:43.118913 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a11f03c1-788a-4e80-aaf8-02dd3a150814-kube-api-access\") pod \"installer-9-crc\" (UID: \"a11f03c1-788a-4e80-aaf8-02dd3a150814\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 08:48:43 crc kubenswrapper[4925]: I0226 08:48:43.118957 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a11f03c1-788a-4e80-aaf8-02dd3a150814-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a11f03c1-788a-4e80-aaf8-02dd3a150814\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 08:48:43 crc kubenswrapper[4925]: I0226 08:48:43.220158 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a11f03c1-788a-4e80-aaf8-02dd3a150814-var-lock\") pod \"installer-9-crc\" (UID: \"a11f03c1-788a-4e80-aaf8-02dd3a150814\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 08:48:43 crc kubenswrapper[4925]: I0226 08:48:43.220252 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a11f03c1-788a-4e80-aaf8-02dd3a150814-kube-api-access\") pod \"installer-9-crc\" (UID: \"a11f03c1-788a-4e80-aaf8-02dd3a150814\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 08:48:43 crc kubenswrapper[4925]: I0226 08:48:43.220259 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a11f03c1-788a-4e80-aaf8-02dd3a150814-var-lock\") pod \"installer-9-crc\" (UID: \"a11f03c1-788a-4e80-aaf8-02dd3a150814\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 08:48:43 crc kubenswrapper[4925]: I0226 08:48:43.220332 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a11f03c1-788a-4e80-aaf8-02dd3a150814-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a11f03c1-788a-4e80-aaf8-02dd3a150814\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 08:48:43 crc kubenswrapper[4925]: I0226 08:48:43.220408 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a11f03c1-788a-4e80-aaf8-02dd3a150814-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a11f03c1-788a-4e80-aaf8-02dd3a150814\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 08:48:43 crc kubenswrapper[4925]: I0226 08:48:43.238863 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a11f03c1-788a-4e80-aaf8-02dd3a150814-kube-api-access\") pod \"installer-9-crc\" (UID: \"a11f03c1-788a-4e80-aaf8-02dd3a150814\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 08:48:43 crc kubenswrapper[4925]: I0226 08:48:43.247724 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 08:48:45 crc kubenswrapper[4925]: E0226 08:48:45.230808 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-x9xkw" podUID="5a8eea64-29d5-46bc-89db-23ad451eba4f" Feb 26 08:48:45 crc kubenswrapper[4925]: E0226 08:48:45.408073 4925 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 26 08:48:45 crc kubenswrapper[4925]: E0226 08:48:45.408444 4925 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n72qr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-c8g65_openshift-marketplace(cf852adc-ded1-43ba-9ed4-cf31d6eda903): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 08:48:45 crc kubenswrapper[4925]: E0226 08:48:45.410262 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-c8g65" podUID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" Feb 26 08:48:45 crc 
kubenswrapper[4925]: I0226 08:48:45.872343 4925 patch_prober.go:28] interesting pod/machine-config-daemon-s6pcl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 08:48:45 crc kubenswrapper[4925]: I0226 08:48:45.872399 4925 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" podUID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 08:48:46 crc kubenswrapper[4925]: E0226 08:48:46.841232 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-c8g65" podUID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" Feb 26 08:48:46 crc kubenswrapper[4925]: E0226 08:48:46.910542 4925 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 26 08:48:46 crc kubenswrapper[4925]: E0226 08:48:46.910803 4925 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-czrnm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xgq9g_openshift-marketplace(b8aa32c8-5b82-46ac-a7e2-7d6399474aff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 08:48:46 crc kubenswrapper[4925]: E0226 08:48:46.912052 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-xgq9g" podUID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" Feb 26 08:48:48 crc 
kubenswrapper[4925]: E0226 08:48:48.390665 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xgq9g" podUID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" Feb 26 08:48:48 crc kubenswrapper[4925]: I0226 08:48:48.405021 4925 scope.go:117] "RemoveContainer" containerID="fa3e04c36b043e4e73b0eb4e82809b50edaf8a5f032f85df8a60aae6c89653a4" Feb 26 08:48:48 crc kubenswrapper[4925]: E0226 08:48:48.869012 4925 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 26 08:48:48 crc kubenswrapper[4925]: E0226 08:48:48.869165 4925 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n2h9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-7pkxc_openshift-marketplace(e99e124b-4917-49df-8c28-04373cfed9d3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 08:48:48 crc kubenswrapper[4925]: E0226 08:48:48.870445 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7pkxc" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" Feb 26 08:48:49 crc 
kubenswrapper[4925]: E0226 08:48:49.557462 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7pkxc" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" Feb 26 08:48:49 crc kubenswrapper[4925]: E0226 08:48:49.817724 4925 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 26 08:48:49 crc kubenswrapper[4925]: E0226 08:48:49.818298 4925 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nff7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mrq8g_openshift-marketplace(ce17c01b-f0a9-4f82-9512-028c8f673f85): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 08:48:49 crc kubenswrapper[4925]: E0226 08:48:49.819518 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mrq8g" podUID="ce17c01b-f0a9-4f82-9512-028c8f673f85" Feb 26 08:48:49 crc 
kubenswrapper[4925]: E0226 08:48:49.909467 4925 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 26 08:48:49 crc kubenswrapper[4925]: E0226 08:48:49.909645 4925 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cj2zh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-gdzrh_openshift-marketplace(78d95f85-01b1-4a6b-8c48-b477f840035d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 08:48:49 crc kubenswrapper[4925]: E0226 08:48:49.911552 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gdzrh" podUID="78d95f85-01b1-4a6b-8c48-b477f840035d" Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.015403 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"] Feb 26 08:48:50 crc kubenswrapper[4925]: W0226 08:48:50.050929 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc30177a7_9445_4e51_94c7_b1a86984ad4b.slice/crio-81e2a6ca9e6b4e64ead00c45fd1c935d7f5deb0b0b9ced834228a6372ac921ca WatchSource:0}: Error finding container 81e2a6ca9e6b4e64ead00c45fd1c935d7f5deb0b0b9ced834228a6372ac921ca: Status 404 returned error can't find the container with id 81e2a6ca9e6b4e64ead00c45fd1c935d7f5deb0b0b9ced834228a6372ac921ca Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.134283 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 08:48:50 crc kubenswrapper[4925]: W0226 08:48:50.149596 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda11f03c1_788a_4e80_aaf8_02dd3a150814.slice/crio-09b5e059ece7f456508de29920466d133bd4de0e37ddec0ee3fa4ee203320e66 WatchSource:0}: Error finding container 09b5e059ece7f456508de29920466d133bd4de0e37ddec0ee3fa4ee203320e66: Status 404 returned error can't find the container with id 
09b5e059ece7f456508de29920466d133bd4de0e37ddec0ee3fa4ee203320e66
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.220205 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"]
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.225932 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 26 08:48:50 crc kubenswrapper[4925]: W0226 08:48:50.228934 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda58b8632_0501_4eac_8432_b67dda393366.slice/crio-1141c00e1972f26b9a07f952996273a8de6faeac56fec19dbe9c40493aabbd56 WatchSource:0}: Error finding container 1141c00e1972f26b9a07f952996273a8de6faeac56fec19dbe9c40493aabbd56: Status 404 returned error can't find the container with id 1141c00e1972f26b9a07f952996273a8de6faeac56fec19dbe9c40493aabbd56
Feb 26 08:48:50 crc kubenswrapper[4925]: W0226 08:48:50.245070 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddda7988d_56f1_4f44_b373_650834d33593.slice/crio-67d18f72123a2d175eeff1cbb4093881e3df9295a8d8bf665e729a86b16a6cc5 WatchSource:0}: Error finding container 67d18f72123a2d175eeff1cbb4093881e3df9295a8d8bf665e729a86b16a6cc5: Status 404 returned error can't find the container with id 67d18f72123a2d175eeff1cbb4093881e3df9295a8d8bf665e729a86b16a6cc5
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.489492 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534928-pxjft" event={"ID":"b2a81774-e811-40f9-a623-c50a6bcd5d7b","Type":"ContainerStarted","Data":"b9db05ef4ecc6cb32dd117435afee55e2a6325ea6e1b0cb07e94963b825794ee"}
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.496468 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a11f03c1-788a-4e80-aaf8-02dd3a150814","Type":"ContainerStarted","Data":"09b5e059ece7f456508de29920466d133bd4de0e37ddec0ee3fa4ee203320e66"}
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.500728 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp" event={"ID":"c30177a7-9445-4e51-94c7-b1a86984ad4b","Type":"ContainerStarted","Data":"e65f3e31224446dbcfbc427029dd2289848caced8cb460701c5522cfaf7b0955"}
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.500774 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp" event={"ID":"c30177a7-9445-4e51-94c7-b1a86984ad4b","Type":"ContainerStarted","Data":"81e2a6ca9e6b4e64ead00c45fd1c935d7f5deb0b0b9ced834228a6372ac921ca"}
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.500802 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp" podUID="c30177a7-9445-4e51-94c7-b1a86984ad4b" containerName="controller-manager" containerID="cri-o://e65f3e31224446dbcfbc427029dd2289848caced8cb460701c5522cfaf7b0955" gracePeriod=30
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.500872 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.504306 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs499" event={"ID":"75ccca07-d75b-4817-b203-931dc51638f4","Type":"ContainerStarted","Data":"5f1d6cfdde00966b8efd39d6f328135037a27eac453f379e4a94a477343b99cc"}
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.522721 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29534928-pxjft" podStartSLOduration=2.727587738 podStartE2EDuration="50.522695212s" podCreationTimestamp="2026-02-26 08:48:00 +0000 UTC" firstStartedPulling="2026-02-26 08:48:01.921105886 +0000 UTC m=+214.060680169" lastFinishedPulling="2026-02-26 08:48:49.71621336 +0000 UTC m=+261.855787643" observedRunningTime="2026-02-26 08:48:50.517660618 +0000 UTC m=+262.657234911" watchObservedRunningTime="2026-02-26 08:48:50.522695212 +0000 UTC m=+262.662269495"
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.544562 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp" podStartSLOduration=29.544513277 podStartE2EDuration="29.544513277s" podCreationTimestamp="2026-02-26 08:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:50.544286591 +0000 UTC m=+262.683860874" watchObservedRunningTime="2026-02-26 08:48:50.544513277 +0000 UTC m=+262.684087570"
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.544904 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"dda7988d-56f1-4f44-b373-650834d33593","Type":"ContainerStarted","Data":"67d18f72123a2d175eeff1cbb4093881e3df9295a8d8bf665e729a86b16a6cc5"}
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.544986 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.547696 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26" event={"ID":"a58b8632-0501-4eac-8432-b67dda393366","Type":"ContainerStarted","Data":"f1168a996a8fc1428efc9f827cbb42587991962e3ea71dae63466c78b19316b9"}
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.547740 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26" event={"ID":"a58b8632-0501-4eac-8432-b67dda393366","Type":"ContainerStarted","Data":"1141c00e1972f26b9a07f952996273a8de6faeac56fec19dbe9c40493aabbd56"}
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.547927 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26" podUID="a58b8632-0501-4eac-8432-b67dda393366" containerName="route-controller-manager" containerID="cri-o://f1168a996a8fc1428efc9f827cbb42587991962e3ea71dae63466c78b19316b9" gracePeriod=30
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.548310 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.549340 4925 patch_prober.go:28] interesting pod/route-controller-manager-79686b7648-l4j26 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body=
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.549378 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26" podUID="a58b8632-0501-4eac-8432-b67dda393366" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused"
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.550277 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnjt4" event={"ID":"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7","Type":"ContainerStarted","Data":"9b62e6677a2e4e8bae0144473b485814a7236d35b70441f35a1d5408db8db608"}
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.553670 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534926-k666h" event={"ID":"0e938a2d-afc6-4360-aa50-e93c71d2c8c3","Type":"ContainerStarted","Data":"f673412f2df1db5308a26c13aaeb1004d0994fcbfd0f318b95e23c3cb6a2a3d6"}
Feb 26 08:48:50 crc kubenswrapper[4925]: E0226 08:48:50.554473 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gdzrh" podUID="78d95f85-01b1-4a6b-8c48-b477f840035d"
Feb 26 08:48:50 crc kubenswrapper[4925]: E0226 08:48:50.563046 4925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mrq8g" podUID="ce17c01b-f0a9-4f82-9512-028c8f673f85"
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.640832 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26" podStartSLOduration=29.640809511 podStartE2EDuration="29.640809511s" podCreationTimestamp="2026-02-26 08:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:50.636862284 +0000 UTC m=+262.776436577" watchObservedRunningTime="2026-02-26 08:48:50.640809511 +0000 UTC m=+262.780383794"
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.675897 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29534926-k666h" podStartSLOduration=118.411413264 podStartE2EDuration="2m50.675877532s" podCreationTimestamp="2026-02-26 08:46:00 +0000 UTC" firstStartedPulling="2026-02-26 08:47:57.482308778 +0000 UTC m=+209.621883061" lastFinishedPulling="2026-02-26 08:48:49.746773046 +0000 UTC m=+261.886347329" observedRunningTime="2026-02-26 08:48:50.674271892 +0000 UTC m=+262.813846185" watchObservedRunningTime="2026-02-26 08:48:50.675877532 +0000 UTC m=+262.815451815"
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.842998 4925 csr.go:261] certificate signing request csr-gxm8r is approved, waiting to be issued
Feb 26 08:48:50 crc kubenswrapper[4925]: I0226 08:48:50.850861 4925 csr.go:257] certificate signing request csr-gxm8r is issued
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.316195 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-79686b7648-l4j26_a58b8632-0501-4eac-8432-b67dda393366/route-controller-manager/0.log"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.316265 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.359405 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"]
Feb 26 08:48:51 crc kubenswrapper[4925]: E0226 08:48:51.359755 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58b8632-0501-4eac-8432-b67dda393366" containerName="route-controller-manager"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.359776 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58b8632-0501-4eac-8432-b67dda393366" containerName="route-controller-manager"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.359917 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="a58b8632-0501-4eac-8432-b67dda393366" containerName="route-controller-manager"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.360429 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.362436 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"]
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.375328 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58b8632-0501-4eac-8432-b67dda393366-serving-cert\") pod \"a58b8632-0501-4eac-8432-b67dda393366\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") "
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.375371 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a58b8632-0501-4eac-8432-b67dda393366-config\") pod \"a58b8632-0501-4eac-8432-b67dda393366\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") "
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.375440 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dp6n\" (UniqueName: \"kubernetes.io/projected/a58b8632-0501-4eac-8432-b67dda393366-kube-api-access-5dp6n\") pod \"a58b8632-0501-4eac-8432-b67dda393366\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") "
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.375492 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a58b8632-0501-4eac-8432-b67dda393366-client-ca\") pod \"a58b8632-0501-4eac-8432-b67dda393366\" (UID: \"a58b8632-0501-4eac-8432-b67dda393366\") "
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.375736 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-serving-cert\") pod \"route-controller-manager-f46886d5b-658cs\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.375786 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-client-ca\") pod \"route-controller-manager-f46886d5b-658cs\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.375813 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-config\") pod \"route-controller-manager-f46886d5b-658cs\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.375875 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfjjv\" (UniqueName: \"kubernetes.io/projected/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-kube-api-access-qfjjv\") pod \"route-controller-manager-f46886d5b-658cs\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.377564 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a58b8632-0501-4eac-8432-b67dda393366-client-ca" (OuterVolumeSpecName: "client-ca") pod "a58b8632-0501-4eac-8432-b67dda393366" (UID: "a58b8632-0501-4eac-8432-b67dda393366"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.378053 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a58b8632-0501-4eac-8432-b67dda393366-config" (OuterVolumeSpecName: "config") pod "a58b8632-0501-4eac-8432-b67dda393366" (UID: "a58b8632-0501-4eac-8432-b67dda393366"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.385082 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58b8632-0501-4eac-8432-b67dda393366-kube-api-access-5dp6n" (OuterVolumeSpecName: "kube-api-access-5dp6n") pod "a58b8632-0501-4eac-8432-b67dda393366" (UID: "a58b8632-0501-4eac-8432-b67dda393366"). InnerVolumeSpecName "kube-api-access-5dp6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.385133 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58b8632-0501-4eac-8432-b67dda393366-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a58b8632-0501-4eac-8432-b67dda393366" (UID: "a58b8632-0501-4eac-8432-b67dda393366"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.476778 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-serving-cert\") pod \"route-controller-manager-f46886d5b-658cs\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.476839 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-client-ca\") pod \"route-controller-manager-f46886d5b-658cs\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.477859 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-config\") pod \"route-controller-manager-f46886d5b-658cs\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.477924 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfjjv\" (UniqueName: \"kubernetes.io/projected/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-kube-api-access-qfjjv\") pod \"route-controller-manager-f46886d5b-658cs\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.477960 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dp6n\" (UniqueName: \"kubernetes.io/projected/a58b8632-0501-4eac-8432-b67dda393366-kube-api-access-5dp6n\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.477970 4925 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a58b8632-0501-4eac-8432-b67dda393366-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.477978 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58b8632-0501-4eac-8432-b67dda393366-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.477987 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a58b8632-0501-4eac-8432-b67dda393366-config\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.477801 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-client-ca\") pod \"route-controller-manager-f46886d5b-658cs\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.478984 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-config\") pod \"route-controller-manager-f46886d5b-658cs\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.481469 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-serving-cert\") pod \"route-controller-manager-f46886d5b-658cs\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.502729 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfjjv\" (UniqueName: \"kubernetes.io/projected/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-kube-api-access-qfjjv\") pod \"route-controller-manager-f46886d5b-658cs\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.563740 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a11f03c1-788a-4e80-aaf8-02dd3a150814","Type":"ContainerStarted","Data":"af642ad749fa65a4f2db241aaa17dfb27beb28397b85424ebf4848346715849b"}
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.567149 4925 generic.go:334] "Generic (PLEG): container finished" podID="c30177a7-9445-4e51-94c7-b1a86984ad4b" containerID="e65f3e31224446dbcfbc427029dd2289848caced8cb460701c5522cfaf7b0955" exitCode=0
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.567213 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp" event={"ID":"c30177a7-9445-4e51-94c7-b1a86984ad4b","Type":"ContainerDied","Data":"e65f3e31224446dbcfbc427029dd2289848caced8cb460701c5522cfaf7b0955"}
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.575114 4925 generic.go:334] "Generic (PLEG): container finished" podID="75ccca07-d75b-4817-b203-931dc51638f4" containerID="5f1d6cfdde00966b8efd39d6f328135037a27eac453f379e4a94a477343b99cc" exitCode=0
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.575764 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs499" event={"ID":"75ccca07-d75b-4817-b203-931dc51638f4","Type":"ContainerDied","Data":"5f1d6cfdde00966b8efd39d6f328135037a27eac453f379e4a94a477343b99cc"}
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.581613 4925 generic.go:334] "Generic (PLEG): container finished" podID="b2a81774-e811-40f9-a623-c50a6bcd5d7b" containerID="b9db05ef4ecc6cb32dd117435afee55e2a6325ea6e1b0cb07e94963b825794ee" exitCode=0
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.581772 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534928-pxjft" event={"ID":"b2a81774-e811-40f9-a623-c50a6bcd5d7b","Type":"ContainerDied","Data":"b9db05ef4ecc6cb32dd117435afee55e2a6325ea6e1b0cb07e94963b825794ee"}
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.584328 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"dda7988d-56f1-4f44-b373-650834d33593","Type":"ContainerStarted","Data":"fdcffd29d7eb096d3da222688e43ae87cbc6ca45035335579ec8bf53fe92f5c2"}
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.584864 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=9.584847624 podStartE2EDuration="9.584847624s" podCreationTimestamp="2026-02-26 08:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:51.579341679 +0000 UTC m=+263.718915972" watchObservedRunningTime="2026-02-26 08:48:51.584847624 +0000 UTC m=+263.724421907"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.587120 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-79686b7648-l4j26_a58b8632-0501-4eac-8432-b67dda393366/route-controller-manager/0.log"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.587159 4925 generic.go:334] "Generic (PLEG): container finished" podID="a58b8632-0501-4eac-8432-b67dda393366" containerID="f1168a996a8fc1428efc9f827cbb42587991962e3ea71dae63466c78b19316b9" exitCode=2
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.587216 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26" event={"ID":"a58b8632-0501-4eac-8432-b67dda393366","Type":"ContainerDied","Data":"f1168a996a8fc1428efc9f827cbb42587991962e3ea71dae63466c78b19316b9"}
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.587264 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26" event={"ID":"a58b8632-0501-4eac-8432-b67dda393366","Type":"ContainerDied","Data":"1141c00e1972f26b9a07f952996273a8de6faeac56fec19dbe9c40493aabbd56"}
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.587281 4925 scope.go:117] "RemoveContainer" containerID="f1168a996a8fc1428efc9f827cbb42587991962e3ea71dae63466c78b19316b9"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.587262 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.598608 4925 generic.go:334] "Generic (PLEG): container finished" podID="89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" containerID="9b62e6677a2e4e8bae0144473b485814a7236d35b70441f35a1d5408db8db608" exitCode=0
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.598664 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnjt4" event={"ID":"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7","Type":"ContainerDied","Data":"9b62e6677a2e4e8bae0144473b485814a7236d35b70441f35a1d5408db8db608"}
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.606640 4925 generic.go:334] "Generic (PLEG): container finished" podID="0e938a2d-afc6-4360-aa50-e93c71d2c8c3" containerID="f673412f2df1db5308a26c13aaeb1004d0994fcbfd0f318b95e23c3cb6a2a3d6" exitCode=0
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.606698 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534926-k666h" event={"ID":"0e938a2d-afc6-4360-aa50-e93c71d2c8c3","Type":"ContainerDied","Data":"f673412f2df1db5308a26c13aaeb1004d0994fcbfd0f318b95e23c3cb6a2a3d6"}
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.644410 4925 scope.go:117] "RemoveContainer" containerID="f1168a996a8fc1428efc9f827cbb42587991962e3ea71dae63466c78b19316b9"
Feb 26 08:48:51 crc kubenswrapper[4925]: E0226 08:48:51.650187 4925 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1168a996a8fc1428efc9f827cbb42587991962e3ea71dae63466c78b19316b9\": container with ID starting with f1168a996a8fc1428efc9f827cbb42587991962e3ea71dae63466c78b19316b9 not found: ID does not exist" containerID="f1168a996a8fc1428efc9f827cbb42587991962e3ea71dae63466c78b19316b9"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.650236 4925 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1168a996a8fc1428efc9f827cbb42587991962e3ea71dae63466c78b19316b9"} err="failed to get container status \"f1168a996a8fc1428efc9f827cbb42587991962e3ea71dae63466c78b19316b9\": rpc error: code = NotFound desc = could not find container \"f1168a996a8fc1428efc9f827cbb42587991962e3ea71dae63466c78b19316b9\": container with ID starting with f1168a996a8fc1428efc9f827cbb42587991962e3ea71dae63466c78b19316b9 not found: ID does not exist"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.663442 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"]
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.667971 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79686b7648-l4j26"]
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.673342 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.715148 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.784318 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-client-ca\") pod \"c30177a7-9445-4e51-94c7-b1a86984ad4b\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") "
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.784406 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hpxl\" (UniqueName: \"kubernetes.io/projected/c30177a7-9445-4e51-94c7-b1a86984ad4b-kube-api-access-2hpxl\") pod \"c30177a7-9445-4e51-94c7-b1a86984ad4b\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") "
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.784491 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-proxy-ca-bundles\") pod \"c30177a7-9445-4e51-94c7-b1a86984ad4b\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") "
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.784583 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c30177a7-9445-4e51-94c7-b1a86984ad4b-serving-cert\") pod \"c30177a7-9445-4e51-94c7-b1a86984ad4b\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") "
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.784620 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-config\") pod \"c30177a7-9445-4e51-94c7-b1a86984ad4b\" (UID: \"c30177a7-9445-4e51-94c7-b1a86984ad4b\") "
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.785823 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-config" (OuterVolumeSpecName: "config") pod "c30177a7-9445-4e51-94c7-b1a86984ad4b" (UID: "c30177a7-9445-4e51-94c7-b1a86984ad4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.786359 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-client-ca" (OuterVolumeSpecName: "client-ca") pod "c30177a7-9445-4e51-94c7-b1a86984ad4b" (UID: "c30177a7-9445-4e51-94c7-b1a86984ad4b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.786864 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c30177a7-9445-4e51-94c7-b1a86984ad4b" (UID: "c30177a7-9445-4e51-94c7-b1a86984ad4b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.791147 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c30177a7-9445-4e51-94c7-b1a86984ad4b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c30177a7-9445-4e51-94c7-b1a86984ad4b" (UID: "c30177a7-9445-4e51-94c7-b1a86984ad4b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.791550 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c30177a7-9445-4e51-94c7-b1a86984ad4b-kube-api-access-2hpxl" (OuterVolumeSpecName: "kube-api-access-2hpxl") pod "c30177a7-9445-4e51-94c7-b1a86984ad4b" (UID: "c30177a7-9445-4e51-94c7-b1a86984ad4b"). InnerVolumeSpecName "kube-api-access-2hpxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.852910 4925 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-10 10:20:59.529320207 +0000 UTC
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.852956 4925 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7633h32m7.676370781s for next certificate rotation
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.886560 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c30177a7-9445-4e51-94c7-b1a86984ad4b-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.886597 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-config\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.886608 4925 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.886620 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hpxl\" (UniqueName: \"kubernetes.io/projected/c30177a7-9445-4e51-94c7-b1a86984ad4b-kube-api-access-2hpxl\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.886633 4925 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c30177a7-9445-4e51-94c7-b1a86984ad4b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 26 08:48:51 crc kubenswrapper[4925]: I0226 08:48:51.887013 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"]
Feb 26 08:48:51 crc kubenswrapper[4925]: W0226 08:48:51.893119 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32b7b48a_85e2_48f8_ac03_9d4fe0682c95.slice/crio-471d8689c7ae54b8d0aedff0ecdf29696ea8e9e1061bb1c76c8d73e6d127bea5 WatchSource:0}: Error finding container 471d8689c7ae54b8d0aedff0ecdf29696ea8e9e1061bb1c76c8d73e6d127bea5: Status 404 returned error can't find the container with id 471d8689c7ae54b8d0aedff0ecdf29696ea8e9e1061bb1c76c8d73e6d127bea5
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.521343 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a58b8632-0501-4eac-8432-b67dda393366" path="/var/lib/kubelet/pods/a58b8632-0501-4eac-8432-b67dda393366/volumes"
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.616439 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs" event={"ID":"32b7b48a-85e2-48f8-ac03-9d4fe0682c95","Type":"ContainerStarted","Data":"813f99b749ca6ab824da1f1cec6f68814e9669b12c7b47aa21de0db2bb856fc6"}
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.616489 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs" event={"ID":"32b7b48a-85e2-48f8-ac03-9d4fe0682c95","Type":"ContainerStarted","Data":"471d8689c7ae54b8d0aedff0ecdf29696ea8e9e1061bb1c76c8d73e6d127bea5"}
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.617390 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.619206 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp" event={"ID":"c30177a7-9445-4e51-94c7-b1a86984ad4b","Type":"ContainerDied","Data":"81e2a6ca9e6b4e64ead00c45fd1c935d7f5deb0b0b9ced834228a6372ac921ca"}
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.619260 4925 scope.go:117] "RemoveContainer" containerID="e65f3e31224446dbcfbc427029dd2289848caced8cb460701c5522cfaf7b0955"
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.619393 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.622283 4925 generic.go:334] "Generic (PLEG): container finished" podID="dda7988d-56f1-4f44-b373-650834d33593" containerID="fdcffd29d7eb096d3da222688e43ae87cbc6ca45035335579ec8bf53fe92f5c2" exitCode=0
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.622383 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"dda7988d-56f1-4f44-b373-650834d33593","Type":"ContainerDied","Data":"fdcffd29d7eb096d3da222688e43ae87cbc6ca45035335579ec8bf53fe92f5c2"}
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.625287 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.635811 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs" podStartSLOduration=11.635772592 podStartE2EDuration="11.635772592s" podCreationTimestamp="2026-02-26 08:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:52.633766333 +0000 UTC m=+264.773340616" watchObservedRunningTime="2026-02-26 08:48:52.635772592 +0000 UTC m=+264.775346875"
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.686404 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"]
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.693395 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-56df47f7bc-k4ckp"]
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.854072 4925 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-10 23:20:39.015491321 +0000 UTC
Feb 26 08:48:52 crc kubenswrapper[4925]: I0226 08:48:52.854123 4925 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6182h31m46.161371519s for next certificate rotation
Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.156462 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534926-k666h"
Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.199885 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gg9g\" (UniqueName: \"kubernetes.io/projected/0e938a2d-afc6-4360-aa50-e93c71d2c8c3-kube-api-access-8gg9g\") pod \"0e938a2d-afc6-4360-aa50-e93c71d2c8c3\" (UID: \"0e938a2d-afc6-4360-aa50-e93c71d2c8c3\") "
Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.206270 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e938a2d-afc6-4360-aa50-e93c71d2c8c3-kube-api-access-8gg9g" (OuterVolumeSpecName: "kube-api-access-8gg9g") pod "0e938a2d-afc6-4360-aa50-e93c71d2c8c3" (UID: "0e938a2d-afc6-4360-aa50-e93c71d2c8c3"). InnerVolumeSpecName "kube-api-access-8gg9g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.296373 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.302251 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534928-pxjft" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.303059 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gg9g\" (UniqueName: \"kubernetes.io/projected/0e938a2d-afc6-4360-aa50-e93c71d2c8c3-kube-api-access-8gg9g\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.403749 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dda7988d-56f1-4f44-b373-650834d33593-kube-api-access\") pod \"dda7988d-56f1-4f44-b373-650834d33593\" (UID: \"dda7988d-56f1-4f44-b373-650834d33593\") " Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.403785 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dda7988d-56f1-4f44-b373-650834d33593-kubelet-dir\") pod \"dda7988d-56f1-4f44-b373-650834d33593\" (UID: \"dda7988d-56f1-4f44-b373-650834d33593\") " Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.403921 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2qjw\" (UniqueName: \"kubernetes.io/projected/b2a81774-e811-40f9-a623-c50a6bcd5d7b-kube-api-access-b2qjw\") pod \"b2a81774-e811-40f9-a623-c50a6bcd5d7b\" (UID: \"b2a81774-e811-40f9-a623-c50a6bcd5d7b\") " Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.403965 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dda7988d-56f1-4f44-b373-650834d33593-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dda7988d-56f1-4f44-b373-650834d33593" (UID: "dda7988d-56f1-4f44-b373-650834d33593"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.404189 4925 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dda7988d-56f1-4f44-b373-650834d33593-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.407083 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2a81774-e811-40f9-a623-c50a6bcd5d7b-kube-api-access-b2qjw" (OuterVolumeSpecName: "kube-api-access-b2qjw") pod "b2a81774-e811-40f9-a623-c50a6bcd5d7b" (UID: "b2a81774-e811-40f9-a623-c50a6bcd5d7b"). InnerVolumeSpecName "kube-api-access-b2qjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.407138 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda7988d-56f1-4f44-b373-650834d33593-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dda7988d-56f1-4f44-b373-650834d33593" (UID: "dda7988d-56f1-4f44-b373-650834d33593"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.505329 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2qjw\" (UniqueName: \"kubernetes.io/projected/b2a81774-e811-40f9-a623-c50a6bcd5d7b-kube-api-access-b2qjw\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.505361 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dda7988d-56f1-4f44-b373-650834d33593-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.637868 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs499" event={"ID":"75ccca07-d75b-4817-b203-931dc51638f4","Type":"ContainerStarted","Data":"86900f507ee604438c059aa08e874c57a98d6aef7d5045e683084aac6f23b123"} Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.639581 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.639581 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"dda7988d-56f1-4f44-b373-650834d33593","Type":"ContainerDied","Data":"67d18f72123a2d175eeff1cbb4093881e3df9295a8d8bf665e729a86b16a6cc5"} Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.639691 4925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67d18f72123a2d175eeff1cbb4093881e3df9295a8d8bf665e729a86b16a6cc5" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.641170 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534928-pxjft" event={"ID":"b2a81774-e811-40f9-a623-c50a6bcd5d7b","Type":"ContainerDied","Data":"9537c95d197a961ce5ec196a6f7fbaf435445ff318995a57c283825098fba88f"} Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.641205 4925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9537c95d197a961ce5ec196a6f7fbaf435445ff318995a57c283825098fba88f" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.642125 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29534928-pxjft" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.643364 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534926-k666h" event={"ID":"0e938a2d-afc6-4360-aa50-e93c71d2c8c3","Type":"ContainerDied","Data":"9ae2843c7e040e3cb6df50c0f7846d0086e9f87cadee98ea0d95635d01d9e87c"} Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.643395 4925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ae2843c7e040e3cb6df50c0f7846d0086e9f87cadee98ea0d95635d01d9e87c" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.643433 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534926-k666h" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.702376 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zs499" podStartSLOduration=3.565193324 podStartE2EDuration="48.702342553s" podCreationTimestamp="2026-02-26 08:48:05 +0000 UTC" firstStartedPulling="2026-02-26 08:48:08.171868349 +0000 UTC m=+220.311442632" lastFinishedPulling="2026-02-26 08:48:53.309017578 +0000 UTC m=+265.448591861" observedRunningTime="2026-02-26 08:48:53.666087163 +0000 UTC m=+265.805661466" watchObservedRunningTime="2026-02-26 08:48:53.702342553 +0000 UTC m=+265.841916856" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.879917 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7ccbd9765b-nskfs"] Feb 26 08:48:53 crc kubenswrapper[4925]: E0226 08:48:53.880144 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2a81774-e811-40f9-a623-c50a6bcd5d7b" containerName="oc" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.880157 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2a81774-e811-40f9-a623-c50a6bcd5d7b" 
containerName="oc" Feb 26 08:48:53 crc kubenswrapper[4925]: E0226 08:48:53.880170 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e938a2d-afc6-4360-aa50-e93c71d2c8c3" containerName="oc" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.880176 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e938a2d-afc6-4360-aa50-e93c71d2c8c3" containerName="oc" Feb 26 08:48:53 crc kubenswrapper[4925]: E0226 08:48:53.880185 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda7988d-56f1-4f44-b373-650834d33593" containerName="pruner" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.880193 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda7988d-56f1-4f44-b373-650834d33593" containerName="pruner" Feb 26 08:48:53 crc kubenswrapper[4925]: E0226 08:48:53.880209 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30177a7-9445-4e51-94c7-b1a86984ad4b" containerName="controller-manager" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.880216 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30177a7-9445-4e51-94c7-b1a86984ad4b" containerName="controller-manager" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.880322 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e938a2d-afc6-4360-aa50-e93c71d2c8c3" containerName="oc" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.880337 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda7988d-56f1-4f44-b373-650834d33593" containerName="pruner" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.880346 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2a81774-e811-40f9-a623-c50a6bcd5d7b" containerName="oc" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.880354 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="c30177a7-9445-4e51-94c7-b1a86984ad4b" containerName="controller-manager" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 
08:48:53.880720 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.884309 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.884930 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.885374 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.885670 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.885781 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.888229 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.889369 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.891437 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ccbd9765b-nskfs"] Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.910003 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-config\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: 
\"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.910074 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4335457-df11-4f31-af34-2b6a70dbf1d7-serving-cert\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.910130 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nfgp\" (UniqueName: \"kubernetes.io/projected/f4335457-df11-4f31-af34-2b6a70dbf1d7-kube-api-access-4nfgp\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.910175 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-proxy-ca-bundles\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:53 crc kubenswrapper[4925]: I0226 08:48:53.910204 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-client-ca\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:54 crc kubenswrapper[4925]: I0226 08:48:54.011070 4925 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4nfgp\" (UniqueName: \"kubernetes.io/projected/f4335457-df11-4f31-af34-2b6a70dbf1d7-kube-api-access-4nfgp\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:54 crc kubenswrapper[4925]: I0226 08:48:54.011136 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-proxy-ca-bundles\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:54 crc kubenswrapper[4925]: I0226 08:48:54.011166 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-client-ca\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:54 crc kubenswrapper[4925]: I0226 08:48:54.011212 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-config\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:54 crc kubenswrapper[4925]: I0226 08:48:54.011246 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4335457-df11-4f31-af34-2b6a70dbf1d7-serving-cert\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " 
pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:54 crc kubenswrapper[4925]: I0226 08:48:54.012641 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-proxy-ca-bundles\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:54 crc kubenswrapper[4925]: I0226 08:48:54.012883 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-client-ca\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:54 crc kubenswrapper[4925]: I0226 08:48:54.013046 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-config\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:54 crc kubenswrapper[4925]: I0226 08:48:54.027149 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4335457-df11-4f31-af34-2b6a70dbf1d7-serving-cert\") pod \"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:54 crc kubenswrapper[4925]: I0226 08:48:54.030412 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nfgp\" (UniqueName: \"kubernetes.io/projected/f4335457-df11-4f31-af34-2b6a70dbf1d7-kube-api-access-4nfgp\") pod 
\"controller-manager-7ccbd9765b-nskfs\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:54 crc kubenswrapper[4925]: I0226 08:48:54.199296 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:54 crc kubenswrapper[4925]: I0226 08:48:54.521655 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c30177a7-9445-4e51-94c7-b1a86984ad4b" path="/var/lib/kubelet/pods/c30177a7-9445-4e51-94c7-b1a86984ad4b/volumes" Feb 26 08:48:55 crc kubenswrapper[4925]: I0226 08:48:55.479139 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ccbd9765b-nskfs"] Feb 26 08:48:55 crc kubenswrapper[4925]: I0226 08:48:55.659262 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" event={"ID":"f4335457-df11-4f31-af34-2b6a70dbf1d7","Type":"ContainerStarted","Data":"882aaf442e02f9321e0e4639fd57381a6f00aa6137c79cc5f4caf925e9f85ca1"} Feb 26 08:48:56 crc kubenswrapper[4925]: I0226 08:48:56.086789 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:48:56 crc kubenswrapper[4925]: I0226 08:48:56.087143 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:48:56 crc kubenswrapper[4925]: I0226 08:48:56.669433 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnjt4" event={"ID":"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7","Type":"ContainerStarted","Data":"b77bb3d9e6b3343940942f43b73b1e2f717d9b110666577644f5b63a4a4201fe"} Feb 26 08:48:56 crc kubenswrapper[4925]: I0226 08:48:56.672402 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" event={"ID":"f4335457-df11-4f31-af34-2b6a70dbf1d7","Type":"ContainerStarted","Data":"783f1330020deef7b55a91ce0e0481b697921f12ce5499e4be9fd59b494f2915"} Feb 26 08:48:56 crc kubenswrapper[4925]: I0226 08:48:56.672706 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:56 crc kubenswrapper[4925]: I0226 08:48:56.679492 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:48:56 crc kubenswrapper[4925]: I0226 08:48:56.700384 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cnjt4" podStartSLOduration=4.914741311 podStartE2EDuration="51.700365286s" podCreationTimestamp="2026-02-26 08:48:05 +0000 UTC" firstStartedPulling="2026-02-26 08:48:08.17929645 +0000 UTC m=+220.318870733" lastFinishedPulling="2026-02-26 08:48:54.964920425 +0000 UTC m=+267.104494708" observedRunningTime="2026-02-26 08:48:56.695318422 +0000 UTC m=+268.834892725" watchObservedRunningTime="2026-02-26 08:48:56.700365286 +0000 UTC m=+268.839939569" Feb 26 08:48:56 crc kubenswrapper[4925]: I0226 08:48:56.721728 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" podStartSLOduration=15.72170454 podStartE2EDuration="15.72170454s" podCreationTimestamp="2026-02-26 08:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:48:56.718307867 +0000 UTC m=+268.857882170" watchObservedRunningTime="2026-02-26 08:48:56.72170454 +0000 UTC m=+268.861278823" Feb 26 08:48:57 crc kubenswrapper[4925]: I0226 08:48:57.541839 4925 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-zs499" podUID="75ccca07-d75b-4817-b203-931dc51638f4" containerName="registry-server" probeResult="failure" output=< Feb 26 08:48:57 crc kubenswrapper[4925]: timeout: failed to connect service ":50051" within 1s Feb 26 08:48:57 crc kubenswrapper[4925]: > Feb 26 08:48:58 crc kubenswrapper[4925]: I0226 08:48:58.686180 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8g65" event={"ID":"cf852adc-ded1-43ba-9ed4-cf31d6eda903","Type":"ContainerStarted","Data":"5d4454d771c799ec5afdfc7236931be8006d5ef28b9c2f1cfea9e96b5c49e4bf"} Feb 26 08:48:59 crc kubenswrapper[4925]: I0226 08:48:59.693283 4925 generic.go:334] "Generic (PLEG): container finished" podID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" containerID="5d4454d771c799ec5afdfc7236931be8006d5ef28b9c2f1cfea9e96b5c49e4bf" exitCode=0 Feb 26 08:48:59 crc kubenswrapper[4925]: I0226 08:48:59.693355 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8g65" event={"ID":"cf852adc-ded1-43ba-9ed4-cf31d6eda903","Type":"ContainerDied","Data":"5d4454d771c799ec5afdfc7236931be8006d5ef28b9c2f1cfea9e96b5c49e4bf"} Feb 26 08:49:01 crc kubenswrapper[4925]: I0226 08:49:01.745929 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7ccbd9765b-nskfs"] Feb 26 08:49:01 crc kubenswrapper[4925]: I0226 08:49:01.746483 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" podUID="f4335457-df11-4f31-af34-2b6a70dbf1d7" containerName="controller-manager" containerID="cri-o://783f1330020deef7b55a91ce0e0481b697921f12ce5499e4be9fd59b494f2915" gracePeriod=30 Feb 26 08:49:01 crc kubenswrapper[4925]: I0226 08:49:01.775796 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"] Feb 26 08:49:01 
crc kubenswrapper[4925]: I0226 08:49:01.776046 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs" podUID="32b7b48a-85e2-48f8-ac03-9d4fe0682c95" containerName="route-controller-manager" containerID="cri-o://813f99b749ca6ab824da1f1cec6f68814e9669b12c7b47aa21de0db2bb856fc6" gracePeriod=30 Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.712004 4925 generic.go:334] "Generic (PLEG): container finished" podID="f4335457-df11-4f31-af34-2b6a70dbf1d7" containerID="783f1330020deef7b55a91ce0e0481b697921f12ce5499e4be9fd59b494f2915" exitCode=0 Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.712095 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" event={"ID":"f4335457-df11-4f31-af34-2b6a70dbf1d7","Type":"ContainerDied","Data":"783f1330020deef7b55a91ce0e0481b697921f12ce5499e4be9fd59b494f2915"} Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.714012 4925 generic.go:334] "Generic (PLEG): container finished" podID="32b7b48a-85e2-48f8-ac03-9d4fe0682c95" containerID="813f99b749ca6ab824da1f1cec6f68814e9669b12c7b47aa21de0db2bb856fc6" exitCode=0 Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.714035 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs" event={"ID":"32b7b48a-85e2-48f8-ac03-9d4fe0682c95","Type":"ContainerDied","Data":"813f99b749ca6ab824da1f1cec6f68814e9669b12c7b47aa21de0db2bb856fc6"} Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.831114 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.858391 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm"] Feb 26 08:49:02 crc kubenswrapper[4925]: E0226 08:49:02.858693 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b7b48a-85e2-48f8-ac03-9d4fe0682c95" containerName="route-controller-manager" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.858709 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b7b48a-85e2-48f8-ac03-9d4fe0682c95" containerName="route-controller-manager" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.858834 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b7b48a-85e2-48f8-ac03-9d4fe0682c95" containerName="route-controller-manager" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.859344 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.859866 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfjjv\" (UniqueName: \"kubernetes.io/projected/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-kube-api-access-qfjjv\") pod \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.859903 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-client-ca\") pod \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.859942 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-config\") pod \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.859994 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-serving-cert\") pod \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\" (UID: \"32b7b48a-85e2-48f8-ac03-9d4fe0682c95\") " Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.860099 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bad9fa6d-dc45-4aef-8380-5aa98401b840-serving-cert\") pod \"route-controller-manager-5449b6d7cb-6lgdm\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") " pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:02 crc kubenswrapper[4925]: 
I0226 08:49:02.860128 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bppnd\" (UniqueName: \"kubernetes.io/projected/bad9fa6d-dc45-4aef-8380-5aa98401b840-kube-api-access-bppnd\") pod \"route-controller-manager-5449b6d7cb-6lgdm\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") " pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.860155 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bad9fa6d-dc45-4aef-8380-5aa98401b840-client-ca\") pod \"route-controller-manager-5449b6d7cb-6lgdm\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") " pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.860175 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bad9fa6d-dc45-4aef-8380-5aa98401b840-config\") pod \"route-controller-manager-5449b6d7cb-6lgdm\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") " pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.861028 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-config" (OuterVolumeSpecName: "config") pod "32b7b48a-85e2-48f8-ac03-9d4fe0682c95" (UID: "32b7b48a-85e2-48f8-ac03-9d4fe0682c95"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.861378 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-client-ca" (OuterVolumeSpecName: "client-ca") pod "32b7b48a-85e2-48f8-ac03-9d4fe0682c95" (UID: "32b7b48a-85e2-48f8-ac03-9d4fe0682c95"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.867620 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-kube-api-access-qfjjv" (OuterVolumeSpecName: "kube-api-access-qfjjv") pod "32b7b48a-85e2-48f8-ac03-9d4fe0682c95" (UID: "32b7b48a-85e2-48f8-ac03-9d4fe0682c95"). InnerVolumeSpecName "kube-api-access-qfjjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.867726 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm"] Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.868762 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "32b7b48a-85e2-48f8-ac03-9d4fe0682c95" (UID: "32b7b48a-85e2-48f8-ac03-9d4fe0682c95"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.961027 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bad9fa6d-dc45-4aef-8380-5aa98401b840-client-ca\") pod \"route-controller-manager-5449b6d7cb-6lgdm\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") " pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.961075 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bad9fa6d-dc45-4aef-8380-5aa98401b840-config\") pod \"route-controller-manager-5449b6d7cb-6lgdm\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") " pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.961161 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bad9fa6d-dc45-4aef-8380-5aa98401b840-serving-cert\") pod \"route-controller-manager-5449b6d7cb-6lgdm\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") " pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.961193 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bppnd\" (UniqueName: \"kubernetes.io/projected/bad9fa6d-dc45-4aef-8380-5aa98401b840-kube-api-access-bppnd\") pod \"route-controller-manager-5449b6d7cb-6lgdm\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") " pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.961253 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.961265 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfjjv\" (UniqueName: \"kubernetes.io/projected/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-kube-api-access-qfjjv\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.961275 4925 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.961284 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32b7b48a-85e2-48f8-ac03-9d4fe0682c95-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.962681 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bad9fa6d-dc45-4aef-8380-5aa98401b840-config\") pod \"route-controller-manager-5449b6d7cb-6lgdm\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") " pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.963299 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bad9fa6d-dc45-4aef-8380-5aa98401b840-client-ca\") pod \"route-controller-manager-5449b6d7cb-6lgdm\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") " pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.965121 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bad9fa6d-dc45-4aef-8380-5aa98401b840-serving-cert\") pod 
\"route-controller-manager-5449b6d7cb-6lgdm\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") " pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:02 crc kubenswrapper[4925]: I0226 08:49:02.978202 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bppnd\" (UniqueName: \"kubernetes.io/projected/bad9fa6d-dc45-4aef-8380-5aa98401b840-kube-api-access-bppnd\") pod \"route-controller-manager-5449b6d7cb-6lgdm\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") " pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:03 crc kubenswrapper[4925]: I0226 08:49:03.198241 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" Feb 26 08:49:03 crc kubenswrapper[4925]: I0226 08:49:03.330735 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-btcxz"] Feb 26 08:49:03 crc kubenswrapper[4925]: I0226 08:49:03.720911 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs" event={"ID":"32b7b48a-85e2-48f8-ac03-9d4fe0682c95","Type":"ContainerDied","Data":"471d8689c7ae54b8d0aedff0ecdf29696ea8e9e1061bb1c76c8d73e6d127bea5"} Feb 26 08:49:03 crc kubenswrapper[4925]: I0226 08:49:03.720993 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs" Feb 26 08:49:03 crc kubenswrapper[4925]: I0226 08:49:03.721168 4925 scope.go:117] "RemoveContainer" containerID="813f99b749ca6ab824da1f1cec6f68814e9669b12c7b47aa21de0db2bb856fc6" Feb 26 08:49:03 crc kubenswrapper[4925]: I0226 08:49:03.752072 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"] Feb 26 08:49:03 crc kubenswrapper[4925]: I0226 08:49:03.755723 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f46886d5b-658cs"] Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.514895 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b7b48a-85e2-48f8-ac03-9d4fe0682c95" path="/var/lib/kubelet/pods/32b7b48a-85e2-48f8-ac03-9d4fe0682c95/volumes" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.586167 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.683455 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-config\") pod \"f4335457-df11-4f31-af34-2b6a70dbf1d7\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.683583 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-proxy-ca-bundles\") pod \"f4335457-df11-4f31-af34-2b6a70dbf1d7\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.683632 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4335457-df11-4f31-af34-2b6a70dbf1d7-serving-cert\") pod \"f4335457-df11-4f31-af34-2b6a70dbf1d7\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.683719 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nfgp\" (UniqueName: \"kubernetes.io/projected/f4335457-df11-4f31-af34-2b6a70dbf1d7-kube-api-access-4nfgp\") pod \"f4335457-df11-4f31-af34-2b6a70dbf1d7\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.683801 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-client-ca\") pod \"f4335457-df11-4f31-af34-2b6a70dbf1d7\" (UID: \"f4335457-df11-4f31-af34-2b6a70dbf1d7\") " Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.685204 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f4335457-df11-4f31-af34-2b6a70dbf1d7" (UID: "f4335457-df11-4f31-af34-2b6a70dbf1d7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.685289 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-client-ca" (OuterVolumeSpecName: "client-ca") pod "f4335457-df11-4f31-af34-2b6a70dbf1d7" (UID: "f4335457-df11-4f31-af34-2b6a70dbf1d7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.685317 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-config" (OuterVolumeSpecName: "config") pod "f4335457-df11-4f31-af34-2b6a70dbf1d7" (UID: "f4335457-df11-4f31-af34-2b6a70dbf1d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.688407 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4335457-df11-4f31-af34-2b6a70dbf1d7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f4335457-df11-4f31-af34-2b6a70dbf1d7" (UID: "f4335457-df11-4f31-af34-2b6a70dbf1d7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.688521 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4335457-df11-4f31-af34-2b6a70dbf1d7-kube-api-access-4nfgp" (OuterVolumeSpecName: "kube-api-access-4nfgp") pod "f4335457-df11-4f31-af34-2b6a70dbf1d7" (UID: "f4335457-df11-4f31-af34-2b6a70dbf1d7"). InnerVolumeSpecName "kube-api-access-4nfgp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.727934 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" event={"ID":"f4335457-df11-4f31-af34-2b6a70dbf1d7","Type":"ContainerDied","Data":"882aaf442e02f9321e0e4639fd57381a6f00aa6137c79cc5f4caf925e9f85ca1"} Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.727956 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.728012 4925 scope.go:117] "RemoveContainer" containerID="783f1330020deef7b55a91ce0e0481b697921f12ce5499e4be9fd59b494f2915" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.752049 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7ccbd9765b-nskfs"] Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.755702 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7ccbd9765b-nskfs"] Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.785618 4925 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.785652 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-config\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.785661 4925 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4335457-df11-4f31-af34-2b6a70dbf1d7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 
08:49:04.785672 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4335457-df11-4f31-af34-2b6a70dbf1d7-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.785683 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nfgp\" (UniqueName: \"kubernetes.io/projected/f4335457-df11-4f31-af34-2b6a70dbf1d7-kube-api-access-4nfgp\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.897465 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-945755d8d-2gv8m"] Feb 26 08:49:04 crc kubenswrapper[4925]: E0226 08:49:04.897716 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4335457-df11-4f31-af34-2b6a70dbf1d7" containerName="controller-manager" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.897739 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4335457-df11-4f31-af34-2b6a70dbf1d7" containerName="controller-manager" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.897866 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4335457-df11-4f31-af34-2b6a70dbf1d7" containerName="controller-manager" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.898328 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.902050 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.902170 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.902409 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.902505 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.902651 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.903386 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.910487 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-945755d8d-2gv8m"] Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.911571 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.988121 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2a243e2-ace9-4720-af01-e657bc8fb8c8-serving-cert\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " 
pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.988211 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-client-ca\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.988278 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-proxy-ca-bundles\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.988318 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf7gx\" (UniqueName: \"kubernetes.io/projected/d2a243e2-ace9-4720-af01-e657bc8fb8c8-kube-api-access-lf7gx\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:04 crc kubenswrapper[4925]: I0226 08:49:04.988427 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-config\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:05 crc kubenswrapper[4925]: I0226 08:49:05.089638 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-config\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:05 crc kubenswrapper[4925]: I0226 08:49:05.089691 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2a243e2-ace9-4720-af01-e657bc8fb8c8-serving-cert\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:05 crc kubenswrapper[4925]: I0226 08:49:05.089725 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-client-ca\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:05 crc kubenswrapper[4925]: I0226 08:49:05.090407 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-proxy-ca-bundles\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:05 crc kubenswrapper[4925]: I0226 08:49:05.090462 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf7gx\" (UniqueName: \"kubernetes.io/projected/d2a243e2-ace9-4720-af01-e657bc8fb8c8-kube-api-access-lf7gx\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:05 crc kubenswrapper[4925]: I0226 08:49:05.090733 4925 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-client-ca\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:05 crc kubenswrapper[4925]: I0226 08:49:05.091335 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-proxy-ca-bundles\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:05 crc kubenswrapper[4925]: I0226 08:49:05.091669 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-config\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:05 crc kubenswrapper[4925]: I0226 08:49:05.099385 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2a243e2-ace9-4720-af01-e657bc8fb8c8-serving-cert\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:05 crc kubenswrapper[4925]: I0226 08:49:05.118311 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf7gx\" (UniqueName: \"kubernetes.io/projected/d2a243e2-ace9-4720-af01-e657bc8fb8c8-kube-api-access-lf7gx\") pod \"controller-manager-945755d8d-2gv8m\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") " pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:05 crc 
kubenswrapper[4925]: I0226 08:49:05.200811 4925 patch_prober.go:28] interesting pod/controller-manager-7ccbd9765b-nskfs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: i/o timeout" start-of-body= Feb 26 08:49:05 crc kubenswrapper[4925]: I0226 08:49:05.200910 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7ccbd9765b-nskfs" podUID="f4335457-df11-4f31-af34-2b6a70dbf1d7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: i/o timeout" Feb 26 08:49:05 crc kubenswrapper[4925]: I0226 08:49:05.224347 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" Feb 26 08:49:06 crc kubenswrapper[4925]: I0226 08:49:06.262391 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:49:06 crc kubenswrapper[4925]: I0226 08:49:06.326508 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zs499" Feb 26 08:49:06 crc kubenswrapper[4925]: I0226 08:49:06.333783 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:49:06 crc kubenswrapper[4925]: I0226 08:49:06.333834 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:49:06 crc kubenswrapper[4925]: I0226 08:49:06.380816 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:49:06 crc kubenswrapper[4925]: I0226 08:49:06.514066 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f4335457-df11-4f31-af34-2b6a70dbf1d7" path="/var/lib/kubelet/pods/f4335457-df11-4f31-af34-2b6a70dbf1d7/volumes" Feb 26 08:49:06 crc kubenswrapper[4925]: I0226 08:49:06.786954 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cnjt4" Feb 26 08:49:06 crc kubenswrapper[4925]: I0226 08:49:06.809397 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm"] Feb 26 08:49:07 crc kubenswrapper[4925]: I0226 08:49:07.931199 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cnjt4"] Feb 26 08:49:09 crc kubenswrapper[4925]: I0226 08:49:08.751694 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cnjt4" podUID="89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" containerName="registry-server" containerID="cri-o://b77bb3d9e6b3343940942f43b73b1e2f717d9b110666577644f5b63a4a4201fe" gracePeriod=2 Feb 26 08:49:09 crc kubenswrapper[4925]: I0226 08:49:09.760202 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" event={"ID":"bad9fa6d-dc45-4aef-8380-5aa98401b840","Type":"ContainerStarted","Data":"9e24915a16c47c7cec3cc98cd26a7758fa31e17f1490851a842ce7233157c243"} Feb 26 08:49:10 crc kubenswrapper[4925]: I0226 08:49:10.189169 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-945755d8d-2gv8m"] Feb 26 08:49:10 crc kubenswrapper[4925]: W0226 08:49:10.197154 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2a243e2_ace9_4720_af01_e657bc8fb8c8.slice/crio-bc305c1ee2b9a492af0443c7ef45a5b851603c62872a18ba05e9c2c1d872f949 WatchSource:0}: Error finding container 
bc305c1ee2b9a492af0443c7ef45a5b851603c62872a18ba05e9c2c1d872f949: Status 404 returned error can't find the container with id bc305c1ee2b9a492af0443c7ef45a5b851603c62872a18ba05e9c2c1d872f949 Feb 26 08:49:10 crc kubenswrapper[4925]: I0226 08:49:10.767185 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" event={"ID":"d2a243e2-ace9-4720-af01-e657bc8fb8c8","Type":"ContainerStarted","Data":"bc305c1ee2b9a492af0443c7ef45a5b851603c62872a18ba05e9c2c1d872f949"} Feb 26 08:49:11 crc kubenswrapper[4925]: I0226 08:49:11.779641 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" event={"ID":"d2a243e2-ace9-4720-af01-e657bc8fb8c8","Type":"ContainerStarted","Data":"b5df47ba66a61b97f64698ca3baea726fb043b2be6345e966de89d2071ebaf68"} Feb 26 08:49:11 crc kubenswrapper[4925]: I0226 08:49:11.785195 4925 generic.go:334] "Generic (PLEG): container finished" podID="89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" containerID="b77bb3d9e6b3343940942f43b73b1e2f717d9b110666577644f5b63a4a4201fe" exitCode=0 Feb 26 08:49:11 crc kubenswrapper[4925]: I0226 08:49:11.785268 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnjt4" event={"ID":"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7","Type":"ContainerDied","Data":"b77bb3d9e6b3343940942f43b73b1e2f717d9b110666577644f5b63a4a4201fe"} Feb 26 08:49:11 crc kubenswrapper[4925]: I0226 08:49:11.786962 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9xkw" event={"ID":"5a8eea64-29d5-46bc-89db-23ad451eba4f","Type":"ContainerStarted","Data":"c22845da049700b9ec7327b515fba0bb7b11c146c7803d1b90c759b7b277467d"} Feb 26 08:49:11 crc kubenswrapper[4925]: I0226 08:49:11.788633 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgq9g" 
event={"ID":"b8aa32c8-5b82-46ac-a7e2-7d6399474aff","Type":"ContainerStarted","Data":"73a6e07adb99d67f8d2c0f1fb2ec016c86a5e24c69a252f8e78d8620ee3b3e34"}
Feb 26 08:49:11 crc kubenswrapper[4925]: I0226 08:49:11.794959 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8g65" event={"ID":"cf852adc-ded1-43ba-9ed4-cf31d6eda903","Type":"ContainerStarted","Data":"50cb856ad17a717d1c908ab7d22a6429ce6ae6bd07ed3e2c2fc82feb4e023258"}
Feb 26 08:49:11 crc kubenswrapper[4925]: I0226 08:49:11.796338 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" event={"ID":"bad9fa6d-dc45-4aef-8380-5aa98401b840","Type":"ContainerStarted","Data":"5682d91592263f23e0acf766e4be49d63286c1ba515cfe8cf2db7d787527fb15"}
Feb 26 08:49:11 crc kubenswrapper[4925]: I0226 08:49:11.796983 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm"
Feb 26 08:49:12 crc kubenswrapper[4925]: I0226 08:49:12.237735 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm"
Feb 26 08:49:12 crc kubenswrapper[4925]: I0226 08:49:12.261846 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" podStartSLOduration=11.261830495 podStartE2EDuration="11.261830495s" podCreationTimestamp="2026-02-26 08:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:49:11.937157345 +0000 UTC m=+284.076731638" watchObservedRunningTime="2026-02-26 08:49:12.261830495 +0000 UTC m=+284.401404788"
Feb 26 08:49:12 crc kubenswrapper[4925]: I0226 08:49:12.263717 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cnjt4"
Feb 26 08:49:12 crc kubenswrapper[4925]: I0226 08:49:12.363790 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9drdq\" (UniqueName: \"kubernetes.io/projected/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-kube-api-access-9drdq\") pod \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\" (UID: \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\") "
Feb 26 08:49:12 crc kubenswrapper[4925]: I0226 08:49:12.363842 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-catalog-content\") pod \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\" (UID: \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\") "
Feb 26 08:49:12 crc kubenswrapper[4925]: I0226 08:49:12.363910 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-utilities\") pod \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\" (UID: \"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7\") "
Feb 26 08:49:12 crc kubenswrapper[4925]: I0226 08:49:12.364822 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-utilities" (OuterVolumeSpecName: "utilities") pod "89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" (UID: "89b3dc84-3c4c-44fd-9cd3-1da9c62753e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 08:49:12 crc kubenswrapper[4925]: I0226 08:49:12.456064 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-kube-api-access-9drdq" (OuterVolumeSpecName: "kube-api-access-9drdq") pod "89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" (UID: "89b3dc84-3c4c-44fd-9cd3-1da9c62753e7"). InnerVolumeSpecName "kube-api-access-9drdq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:49:12 crc kubenswrapper[4925]: I0226 08:49:12.466269 4925 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 08:49:12 crc kubenswrapper[4925]: I0226 08:49:12.466309 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9drdq\" (UniqueName: \"kubernetes.io/projected/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-kube-api-access-9drdq\") on node \"crc\" DevicePath \"\""
Feb 26 08:49:12 crc kubenswrapper[4925]: I0226 08:49:12.544376 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" (UID: "89b3dc84-3c4c-44fd-9cd3-1da9c62753e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:12.568835 4925 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:13.122001 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cnjt4"
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:13.124615 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cnjt4" event={"ID":"89b3dc84-3c4c-44fd-9cd3-1da9c62753e7","Type":"ContainerDied","Data":"0e0b9da4888bcc4bd6d757b8bb8ecf603e4959bd40b8966ded4f6918c7c7e1c9"}
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:13.124694 4925 scope.go:117] "RemoveContainer" containerID="b77bb3d9e6b3343940942f43b73b1e2f717d9b110666577644f5b63a4a4201fe"
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:13.124871 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m"
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:13.137689 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m"
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:13.239676 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" podStartSLOduration=12.239657578 podStartE2EDuration="12.239657578s" podCreationTimestamp="2026-02-26 08:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:49:13.237862334 +0000 UTC m=+285.377436617" watchObservedRunningTime="2026-02-26 08:49:13.239657578 +0000 UTC m=+285.379231871"
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:13.296347 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c8g65" podStartSLOduration=5.828462832 podStartE2EDuration="1m10.296319819s" podCreationTimestamp="2026-02-26 08:48:03 +0000 UTC" firstStartedPulling="2026-02-26 08:48:05.028657427 +0000 UTC m=+217.168231710" lastFinishedPulling="2026-02-26 08:49:09.496514414 +0000 UTC m=+281.636088697" observedRunningTime="2026-02-26 08:49:13.294609697 +0000 UTC m=+285.434183990" watchObservedRunningTime="2026-02-26 08:49:13.296319819 +0000 UTC m=+285.435894102"
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:13.326154 4925 scope.go:117] "RemoveContainer" containerID="9b62e6677a2e4e8bae0144473b485814a7236d35b70441f35a1d5408db8db608"
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:13.333487 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cnjt4"]
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:13.338472 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cnjt4"]
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:13.351501 4925 scope.go:117] "RemoveContainer" containerID="aabd394629694064e97e24c7db8f0e6fecdba14a42a632ac2bd400e9ab9ddbdf"
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:13.611605 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c8g65"
Feb 26 08:49:13 crc kubenswrapper[4925]: I0226 08:49:13.611715 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c8g65"
Feb 26 08:49:14 crc kubenswrapper[4925]: I0226 08:49:14.128195 4925 generic.go:334] "Generic (PLEG): container finished" podID="5a8eea64-29d5-46bc-89db-23ad451eba4f" containerID="c22845da049700b9ec7327b515fba0bb7b11c146c7803d1b90c759b7b277467d" exitCode=0
Feb 26 08:49:14 crc kubenswrapper[4925]: I0226 08:49:14.128248 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9xkw" event={"ID":"5a8eea64-29d5-46bc-89db-23ad451eba4f","Type":"ContainerDied","Data":"c22845da049700b9ec7327b515fba0bb7b11c146c7803d1b90c759b7b277467d"}
Feb 26 08:49:14 crc kubenswrapper[4925]: I0226 08:49:14.131653 4925 generic.go:334] "Generic (PLEG): container finished" podID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" containerID="73a6e07adb99d67f8d2c0f1fb2ec016c86a5e24c69a252f8e78d8620ee3b3e34" exitCode=0
Feb 26 08:49:14 crc kubenswrapper[4925]: I0226 08:49:14.131753 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgq9g" event={"ID":"b8aa32c8-5b82-46ac-a7e2-7d6399474aff","Type":"ContainerDied","Data":"73a6e07adb99d67f8d2c0f1fb2ec016c86a5e24c69a252f8e78d8620ee3b3e34"}
Feb 26 08:49:14 crc kubenswrapper[4925]: I0226 08:49:14.645071 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" path="/var/lib/kubelet/pods/89b3dc84-3c4c-44fd-9cd3-1da9c62753e7/volumes"
Feb 26 08:49:14 crc kubenswrapper[4925]: I0226 08:49:14.810915 4925 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-c8g65" podUID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" containerName="registry-server" probeResult="failure" output=<
Feb 26 08:49:14 crc kubenswrapper[4925]: timeout: failed to connect service ":50051" within 1s
Feb 26 08:49:14 crc kubenswrapper[4925]: >
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.137081 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdzrh" event={"ID":"78d95f85-01b1-4a6b-8c48-b477f840035d","Type":"ContainerStarted","Data":"d5c5033d869531d932c4966861ba74c3154e78ca0a4108ad26b782187f151b02"}
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.138572 4925 generic.go:334] "Generic (PLEG): container finished" podID="e99e124b-4917-49df-8c28-04373cfed9d3" containerID="306e49a357fd31d97ca234ad4c1f297628844dad259f2ae2e16b99759749d51b" exitCode=0
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.138632 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pkxc" event={"ID":"e99e124b-4917-49df-8c28-04373cfed9d3","Type":"ContainerDied","Data":"306e49a357fd31d97ca234ad4c1f297628844dad259f2ae2e16b99759749d51b"}
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.141223 4925 generic.go:334] "Generic (PLEG): container finished" podID="ce17c01b-f0a9-4f82-9512-028c8f673f85" containerID="795f784e1b60ad89295544dc1390e730f3c25ffcb77de7bc20af4e3016f3f5ab" exitCode=0
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.141285 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrq8g" event={"ID":"ce17c01b-f0a9-4f82-9512-028c8f673f85","Type":"ContainerDied","Data":"795f784e1b60ad89295544dc1390e730f3c25ffcb77de7bc20af4e3016f3f5ab"}
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.147142 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9xkw" event={"ID":"5a8eea64-29d5-46bc-89db-23ad451eba4f","Type":"ContainerStarted","Data":"5968d9351a8ad434084b5595573ee607bfaa6c8e0927f84642ce7833416d0062"}
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.172725 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgq9g" event={"ID":"b8aa32c8-5b82-46ac-a7e2-7d6399474aff","Type":"ContainerStarted","Data":"fb7ce64727a73d224841a28a1448600af4563f8cc3161964be5b24b1f67f6e50"}
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.191271 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x9xkw" podStartSLOduration=3.473331157 podStartE2EDuration="1m13.191250794s" podCreationTimestamp="2026-02-26 08:48:02 +0000 UTC" firstStartedPulling="2026-02-26 08:48:05.067553337 +0000 UTC m=+217.207127620" lastFinishedPulling="2026-02-26 08:49:14.785472974 +0000 UTC m=+286.925047257" observedRunningTime="2026-02-26 08:49:15.188111377 +0000 UTC m=+287.327685670" watchObservedRunningTime="2026-02-26 08:49:15.191250794 +0000 UTC m=+287.330825097"
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.331641 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xgq9g" podStartSLOduration=2.558589095 podStartE2EDuration="1m13.3316225s" podCreationTimestamp="2026-02-26 08:48:02 +0000 UTC" firstStartedPulling="2026-02-26 08:48:03.99255518 +0000 UTC m=+216.132129463" lastFinishedPulling="2026-02-26 08:49:14.765588585 +0000 UTC m=+286.905162868" observedRunningTime="2026-02-26 08:49:15.331255911 +0000 UTC m=+287.470830204" watchObservedRunningTime="2026-02-26 08:49:15.3316225 +0000 UTC m=+287.471196793"
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.872355 4925 patch_prober.go:28] interesting pod/machine-config-daemon-s6pcl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.872748 4925 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" podUID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.872813 4925 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl"
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.873462 4925 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c"} pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 08:49:15 crc kubenswrapper[4925]: I0226 08:49:15.873560 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" podUID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerName="machine-config-daemon" containerID="cri-o://767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c" gracePeriod=600
Feb 26 08:49:16 crc kubenswrapper[4925]: I0226 08:49:16.402579 4925 generic.go:334] "Generic (PLEG): container finished" podID="78d95f85-01b1-4a6b-8c48-b477f840035d" containerID="d5c5033d869531d932c4966861ba74c3154e78ca0a4108ad26b782187f151b02" exitCode=0
Feb 26 08:49:16 crc kubenswrapper[4925]: I0226 08:49:16.402716 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdzrh" event={"ID":"78d95f85-01b1-4a6b-8c48-b477f840035d","Type":"ContainerDied","Data":"d5c5033d869531d932c4966861ba74c3154e78ca0a4108ad26b782187f151b02"}
Feb 26 08:49:16 crc kubenswrapper[4925]: I0226 08:49:16.412671 4925 generic.go:334] "Generic (PLEG): container finished" podID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerID="767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c" exitCode=0
Feb 26 08:49:16 crc kubenswrapper[4925]: I0226 08:49:16.412775 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" event={"ID":"52bdf0f6-9afd-42fe-9a30-7da82bc7f619","Type":"ContainerDied","Data":"767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c"}
Feb 26 08:49:16 crc kubenswrapper[4925]: I0226 08:49:16.417022 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pkxc" event={"ID":"e99e124b-4917-49df-8c28-04373cfed9d3","Type":"ContainerStarted","Data":"69ed512e9218b36df073edbe72da71baa2eec9a550b8d7dd3ed93879d034b9bc"}
Feb 26 08:49:16 crc kubenswrapper[4925]: I0226 08:49:16.422021 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrq8g" event={"ID":"ce17c01b-f0a9-4f82-9512-028c8f673f85","Type":"ContainerStarted","Data":"ab4c89104c314fc942ee59ae17bf5cecafa7ef15189c401d3e135a7b520c22b8"}
Feb 26 08:49:16 crc kubenswrapper[4925]: I0226 08:49:16.469590 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mrq8g" podStartSLOduration=3.037679067 podStartE2EDuration="1m12.469567023s" podCreationTimestamp="2026-02-26 08:48:04 +0000 UTC" firstStartedPulling="2026-02-26 08:48:06.137535901 +0000 UTC m=+218.277110184" lastFinishedPulling="2026-02-26 08:49:15.569423857 +0000 UTC m=+287.708998140" observedRunningTime="2026-02-26 08:49:16.46618562 +0000 UTC m=+288.605759903" watchObservedRunningTime="2026-02-26 08:49:16.469567023 +0000 UTC m=+288.609141306"
Feb 26 08:49:16 crc kubenswrapper[4925]: I0226 08:49:16.511981 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7pkxc" podStartSLOduration=3.158112415 podStartE2EDuration="1m11.511951513s" podCreationTimestamp="2026-02-26 08:48:05 +0000 UTC" firstStartedPulling="2026-02-26 08:48:07.252777881 +0000 UTC m=+219.392352164" lastFinishedPulling="2026-02-26 08:49:15.606616979 +0000 UTC m=+287.746191262" observedRunningTime="2026-02-26 08:49:16.502131872 +0000 UTC m=+288.641706175" watchObservedRunningTime="2026-02-26 08:49:16.511951513 +0000 UTC m=+288.651525796"
Feb 26 08:49:17 crc kubenswrapper[4925]: I0226 08:49:17.432411 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdzrh" event={"ID":"78d95f85-01b1-4a6b-8c48-b477f840035d","Type":"ContainerStarted","Data":"d451798681893aeb29c419ec492cd519b02b9b4b7be97f5101faca564df8f70a"}
Feb 26 08:49:17 crc kubenswrapper[4925]: I0226 08:49:17.438113 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" event={"ID":"52bdf0f6-9afd-42fe-9a30-7da82bc7f619","Type":"ContainerStarted","Data":"0b9902dc2e31bf3f83affdad185ade7bc019efcdf1759a693c00f03b44bf4d0a"}
Feb 26 08:49:18 crc kubenswrapper[4925]: I0226 08:49:18.477247 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gdzrh" podStartSLOduration=4.73994566 podStartE2EDuration="1m16.477208085s" podCreationTimestamp="2026-02-26 08:48:02 +0000 UTC" firstStartedPulling="2026-02-26 08:48:05.063051867 +0000 UTC m=+217.202626150" lastFinishedPulling="2026-02-26 08:49:16.800314292 +0000 UTC m=+288.939888575" observedRunningTime="2026-02-26 08:49:18.472867139 +0000 UTC m=+290.612441452" watchObservedRunningTime="2026-02-26 08:49:18.477208085 +0000 UTC m=+290.616782368"
Feb 26 08:49:21 crc kubenswrapper[4925]: I0226 08:49:21.711244 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-945755d8d-2gv8m"]
Feb 26 08:49:21 crc kubenswrapper[4925]: I0226 08:49:21.712028 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" podUID="d2a243e2-ace9-4720-af01-e657bc8fb8c8" containerName="controller-manager" containerID="cri-o://b5df47ba66a61b97f64698ca3baea726fb043b2be6345e966de89d2071ebaf68" gracePeriod=30
Feb 26 08:49:21 crc kubenswrapper[4925]: I0226 08:49:21.813711 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm"]
Feb 26 08:49:21 crc kubenswrapper[4925]: I0226 08:49:21.813901 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" podUID="bad9fa6d-dc45-4aef-8380-5aa98401b840" containerName="route-controller-manager" containerID="cri-o://5682d91592263f23e0acf766e4be49d63286c1ba515cfe8cf2db7d787527fb15" gracePeriod=30
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.274378 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm"
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.279607 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m"
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.437669 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bad9fa6d-dc45-4aef-8380-5aa98401b840-serving-cert\") pod \"bad9fa6d-dc45-4aef-8380-5aa98401b840\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") "
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.437734 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2a243e2-ace9-4720-af01-e657bc8fb8c8-serving-cert\") pod \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") "
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.437771 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-proxy-ca-bundles\") pod \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") "
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.437817 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bppnd\" (UniqueName: \"kubernetes.io/projected/bad9fa6d-dc45-4aef-8380-5aa98401b840-kube-api-access-bppnd\") pod \"bad9fa6d-dc45-4aef-8380-5aa98401b840\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") "
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.437907 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bad9fa6d-dc45-4aef-8380-5aa98401b840-client-ca\") pod \"bad9fa6d-dc45-4aef-8380-5aa98401b840\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") "
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.437960 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-client-ca\") pod \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") "
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.437978 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-config\") pod \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") "
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.438073 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf7gx\" (UniqueName: \"kubernetes.io/projected/d2a243e2-ace9-4720-af01-e657bc8fb8c8-kube-api-access-lf7gx\") pod \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\" (UID: \"d2a243e2-ace9-4720-af01-e657bc8fb8c8\") "
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.438297 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bad9fa6d-dc45-4aef-8380-5aa98401b840-config\") pod \"bad9fa6d-dc45-4aef-8380-5aa98401b840\" (UID: \"bad9fa6d-dc45-4aef-8380-5aa98401b840\") "
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.439293 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-client-ca" (OuterVolumeSpecName: "client-ca") pod "d2a243e2-ace9-4720-af01-e657bc8fb8c8" (UID: "d2a243e2-ace9-4720-af01-e657bc8fb8c8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.439335 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad9fa6d-dc45-4aef-8380-5aa98401b840-client-ca" (OuterVolumeSpecName: "client-ca") pod "bad9fa6d-dc45-4aef-8380-5aa98401b840" (UID: "bad9fa6d-dc45-4aef-8380-5aa98401b840"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.439364 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-config" (OuterVolumeSpecName: "config") pod "d2a243e2-ace9-4720-af01-e657bc8fb8c8" (UID: "d2a243e2-ace9-4720-af01-e657bc8fb8c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.439316 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad9fa6d-dc45-4aef-8380-5aa98401b840-config" (OuterVolumeSpecName: "config") pod "bad9fa6d-dc45-4aef-8380-5aa98401b840" (UID: "bad9fa6d-dc45-4aef-8380-5aa98401b840"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.439885 4925 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bad9fa6d-dc45-4aef-8380-5aa98401b840-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.439904 4925 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.439917 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-config\") on node \"crc\" DevicePath \"\""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.439927 4925 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bad9fa6d-dc45-4aef-8380-5aa98401b840-config\") on node \"crc\" DevicePath \"\""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.440019 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d2a243e2-ace9-4720-af01-e657bc8fb8c8" (UID: "d2a243e2-ace9-4720-af01-e657bc8fb8c8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.446005 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bad9fa6d-dc45-4aef-8380-5aa98401b840-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bad9fa6d-dc45-4aef-8380-5aa98401b840" (UID: "bad9fa6d-dc45-4aef-8380-5aa98401b840"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.446148 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a243e2-ace9-4720-af01-e657bc8fb8c8-kube-api-access-lf7gx" (OuterVolumeSpecName: "kube-api-access-lf7gx") pod "d2a243e2-ace9-4720-af01-e657bc8fb8c8" (UID: "d2a243e2-ace9-4720-af01-e657bc8fb8c8"). InnerVolumeSpecName "kube-api-access-lf7gx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.454343 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a243e2-ace9-4720-af01-e657bc8fb8c8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d2a243e2-ace9-4720-af01-e657bc8fb8c8" (UID: "d2a243e2-ace9-4720-af01-e657bc8fb8c8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.454465 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad9fa6d-dc45-4aef-8380-5aa98401b840-kube-api-access-bppnd" (OuterVolumeSpecName: "kube-api-access-bppnd") pod "bad9fa6d-dc45-4aef-8380-5aa98401b840" (UID: "bad9fa6d-dc45-4aef-8380-5aa98401b840"). InnerVolumeSpecName "kube-api-access-bppnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.541126 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf7gx\" (UniqueName: \"kubernetes.io/projected/d2a243e2-ace9-4720-af01-e657bc8fb8c8-kube-api-access-lf7gx\") on node \"crc\" DevicePath \"\""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.541162 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bad9fa6d-dc45-4aef-8380-5aa98401b840-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.541176 4925 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2a243e2-ace9-4720-af01-e657bc8fb8c8-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.541188 4925 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2a243e2-ace9-4720-af01-e657bc8fb8c8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.541200 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bppnd\" (UniqueName: \"kubernetes.io/projected/bad9fa6d-dc45-4aef-8380-5aa98401b840-kube-api-access-bppnd\") on node \"crc\" DevicePath \"\""
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.640230 4925 generic.go:334] "Generic (PLEG): container finished" podID="bad9fa6d-dc45-4aef-8380-5aa98401b840" containerID="5682d91592263f23e0acf766e4be49d63286c1ba515cfe8cf2db7d787527fb15" exitCode=0
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.640386 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm"
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.640509 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" event={"ID":"bad9fa6d-dc45-4aef-8380-5aa98401b840","Type":"ContainerDied","Data":"5682d91592263f23e0acf766e4be49d63286c1ba515cfe8cf2db7d787527fb15"}
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.640632 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm" event={"ID":"bad9fa6d-dc45-4aef-8380-5aa98401b840","Type":"ContainerDied","Data":"9e24915a16c47c7cec3cc98cd26a7758fa31e17f1490851a842ce7233157c243"}
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.640660 4925 scope.go:117] "RemoveContainer" containerID="5682d91592263f23e0acf766e4be49d63286c1ba515cfe8cf2db7d787527fb15"
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.641775 4925 generic.go:334] "Generic (PLEG): container finished" podID="d2a243e2-ace9-4720-af01-e657bc8fb8c8" containerID="b5df47ba66a61b97f64698ca3baea726fb043b2be6345e966de89d2071ebaf68" exitCode=0
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.641875 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" event={"ID":"d2a243e2-ace9-4720-af01-e657bc8fb8c8","Type":"ContainerDied","Data":"b5df47ba66a61b97f64698ca3baea726fb043b2be6345e966de89d2071ebaf68"}
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.641941 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m" event={"ID":"d2a243e2-ace9-4720-af01-e657bc8fb8c8","Type":"ContainerDied","Data":"bc305c1ee2b9a492af0443c7ef45a5b851603c62872a18ba05e9c2c1d872f949"}
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.641942 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-945755d8d-2gv8m"
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.667999 4925 scope.go:117] "RemoveContainer" containerID="5682d91592263f23e0acf766e4be49d63286c1ba515cfe8cf2db7d787527fb15"
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.668142 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm"]
Feb 26 08:49:22 crc kubenswrapper[4925]: E0226 08:49:22.668455 4925 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5682d91592263f23e0acf766e4be49d63286c1ba515cfe8cf2db7d787527fb15\": container with ID starting with 5682d91592263f23e0acf766e4be49d63286c1ba515cfe8cf2db7d787527fb15 not found: ID does not exist" containerID="5682d91592263f23e0acf766e4be49d63286c1ba515cfe8cf2db7d787527fb15"
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.668483 4925 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5682d91592263f23e0acf766e4be49d63286c1ba515cfe8cf2db7d787527fb15"} err="failed to get container status \"5682d91592263f23e0acf766e4be49d63286c1ba515cfe8cf2db7d787527fb15\": rpc error: code = NotFound desc = could not find container \"5682d91592263f23e0acf766e4be49d63286c1ba515cfe8cf2db7d787527fb15\": container with ID starting with 5682d91592263f23e0acf766e4be49d63286c1ba515cfe8cf2db7d787527fb15 not found: ID does not exist"
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.668505 4925 scope.go:117] "RemoveContainer" containerID="b5df47ba66a61b97f64698ca3baea726fb043b2be6345e966de89d2071ebaf68"
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.674584 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5449b6d7cb-6lgdm"]
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.678650 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-945755d8d-2gv8m"]
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.681684 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-945755d8d-2gv8m"]
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.691660 4925 scope.go:117] "RemoveContainer" containerID="b5df47ba66a61b97f64698ca3baea726fb043b2be6345e966de89d2071ebaf68"
Feb 26 08:49:22 crc kubenswrapper[4925]: E0226 08:49:22.692071 4925 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5df47ba66a61b97f64698ca3baea726fb043b2be6345e966de89d2071ebaf68\": container with ID starting with b5df47ba66a61b97f64698ca3baea726fb043b2be6345e966de89d2071ebaf68 not found: ID does not exist" containerID="b5df47ba66a61b97f64698ca3baea726fb043b2be6345e966de89d2071ebaf68"
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.692119 4925 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5df47ba66a61b97f64698ca3baea726fb043b2be6345e966de89d2071ebaf68"} err="failed to get container status \"b5df47ba66a61b97f64698ca3baea726fb043b2be6345e966de89d2071ebaf68\": rpc error: code = NotFound desc = could not find container \"b5df47ba66a61b97f64698ca3baea726fb043b2be6345e966de89d2071ebaf68\": container with ID starting with b5df47ba66a61b97f64698ca3baea726fb043b2be6345e966de89d2071ebaf68 not found: ID does not exist"
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.966039 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:49:22 crc kubenswrapper[4925]: I0226 08:49:22.966600 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.021129 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.132712 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz"]
Feb 26 08:49:23 crc kubenswrapper[4925]: E0226 08:49:23.132997 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a243e2-ace9-4720-af01-e657bc8fb8c8" containerName="controller-manager"
Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.133028 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a243e2-ace9-4720-af01-e657bc8fb8c8" containerName="controller-manager"
Feb 26 08:49:23 crc kubenswrapper[4925]: E0226 08:49:23.133043 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad9fa6d-dc45-4aef-8380-5aa98401b840" containerName="route-controller-manager"
Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.133050 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad9fa6d-dc45-4aef-8380-5aa98401b840" containerName="route-controller-manager"
Feb 26 08:49:23 crc kubenswrapper[4925]: E0226 08:49:23.133060 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" containerName="registry-server"
Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.133066 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" containerName="registry-server"
Feb 26 08:49:23 crc kubenswrapper[4925]: E0226 08:49:23.133082 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" containerName="extract-utilities"
Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.133088 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" containerName="extract-utilities"
Feb 26 08:49:23 crc kubenswrapper[4925]:
E0226 08:49:23.133095 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" containerName="extract-content" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.133103 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" containerName="extract-content" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.133190 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b3dc84-3c4c-44fd-9cd3-1da9c62753e7" containerName="registry-server" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.133199 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a243e2-ace9-4720-af01-e657bc8fb8c8" containerName="controller-manager" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.133207 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad9fa6d-dc45-4aef-8380-5aa98401b840" containerName="route-controller-manager" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.134392 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.142349 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.148103 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.148437 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.150139 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x9xkw" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.150210 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x9xkw" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.151012 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.151062 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.152201 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.152482 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c758d44dd-ll286"] Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.153818 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.159346 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c758d44dd-ll286"] Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.159787 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.160227 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.160240 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.160473 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.163284 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.163473 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.164982 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz"] Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.173709 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.198638 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x9xkw" Feb 26 08:49:23 crc 
kubenswrapper[4925]: I0226 08:49:23.254354 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95a83bad-f937-4350-91c3-f37000e465f8-config\") pod \"route-controller-manager-74c5966c96-94dqz\" (UID: \"95a83bad-f937-4350-91c3-f37000e465f8\") " pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.254421 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq9z5\" (UniqueName: \"kubernetes.io/projected/95a83bad-f937-4350-91c3-f37000e465f8-kube-api-access-cq9z5\") pod \"route-controller-manager-74c5966c96-94dqz\" (UID: \"95a83bad-f937-4350-91c3-f37000e465f8\") " pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.254461 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0451d31d-3c1b-4a85-8437-5c40000f3792-client-ca\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.254487 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/95a83bad-f937-4350-91c3-f37000e465f8-client-ca\") pod \"route-controller-manager-74c5966c96-94dqz\" (UID: \"95a83bad-f937-4350-91c3-f37000e465f8\") " pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.254585 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/95a83bad-f937-4350-91c3-f37000e465f8-serving-cert\") pod \"route-controller-manager-74c5966c96-94dqz\" (UID: \"95a83bad-f937-4350-91c3-f37000e465f8\") " pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.254636 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0451d31d-3c1b-4a85-8437-5c40000f3792-config\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.254667 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0451d31d-3c1b-4a85-8437-5c40000f3792-proxy-ca-bundles\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.254918 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6klqx\" (UniqueName: \"kubernetes.io/projected/0451d31d-3c1b-4a85-8437-5c40000f3792-kube-api-access-6klqx\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.254995 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0451d31d-3c1b-4a85-8437-5c40000f3792-serving-cert\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " 
pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.355835 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6klqx\" (UniqueName: \"kubernetes.io/projected/0451d31d-3c1b-4a85-8437-5c40000f3792-kube-api-access-6klqx\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.355898 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0451d31d-3c1b-4a85-8437-5c40000f3792-serving-cert\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.355929 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95a83bad-f937-4350-91c3-f37000e465f8-config\") pod \"route-controller-manager-74c5966c96-94dqz\" (UID: \"95a83bad-f937-4350-91c3-f37000e465f8\") " pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.355950 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq9z5\" (UniqueName: \"kubernetes.io/projected/95a83bad-f937-4350-91c3-f37000e465f8-kube-api-access-cq9z5\") pod \"route-controller-manager-74c5966c96-94dqz\" (UID: \"95a83bad-f937-4350-91c3-f37000e465f8\") " pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.355978 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0451d31d-3c1b-4a85-8437-5c40000f3792-client-ca\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.355999 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/95a83bad-f937-4350-91c3-f37000e465f8-client-ca\") pod \"route-controller-manager-74c5966c96-94dqz\" (UID: \"95a83bad-f937-4350-91c3-f37000e465f8\") " pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.356018 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95a83bad-f937-4350-91c3-f37000e465f8-serving-cert\") pod \"route-controller-manager-74c5966c96-94dqz\" (UID: \"95a83bad-f937-4350-91c3-f37000e465f8\") " pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.356046 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0451d31d-3c1b-4a85-8437-5c40000f3792-config\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.356071 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0451d31d-3c1b-4a85-8437-5c40000f3792-proxy-ca-bundles\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.357371 
4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0451d31d-3c1b-4a85-8437-5c40000f3792-client-ca\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.357381 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/95a83bad-f937-4350-91c3-f37000e465f8-client-ca\") pod \"route-controller-manager-74c5966c96-94dqz\" (UID: \"95a83bad-f937-4350-91c3-f37000e465f8\") " pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.357452 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0451d31d-3c1b-4a85-8437-5c40000f3792-proxy-ca-bundles\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.357552 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95a83bad-f937-4350-91c3-f37000e465f8-config\") pod \"route-controller-manager-74c5966c96-94dqz\" (UID: \"95a83bad-f937-4350-91c3-f37000e465f8\") " pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.357845 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0451d31d-3c1b-4a85-8437-5c40000f3792-config\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 
26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.365259 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0451d31d-3c1b-4a85-8437-5c40000f3792-serving-cert\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.365911 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95a83bad-f937-4350-91c3-f37000e465f8-serving-cert\") pod \"route-controller-manager-74c5966c96-94dqz\" (UID: \"95a83bad-f937-4350-91c3-f37000e465f8\") " pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.373026 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq9z5\" (UniqueName: \"kubernetes.io/projected/95a83bad-f937-4350-91c3-f37000e465f8-kube-api-access-cq9z5\") pod \"route-controller-manager-74c5966c96-94dqz\" (UID: \"95a83bad-f937-4350-91c3-f37000e465f8\") " pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.373729 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6klqx\" (UniqueName: \"kubernetes.io/projected/0451d31d-3c1b-4a85-8437-5c40000f3792-kube-api-access-6klqx\") pod \"controller-manager-6c758d44dd-ll286\" (UID: \"0451d31d-3c1b-4a85-8437-5c40000f3792\") " pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.419731 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.419787 4925 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.453996 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.460077 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.474758 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.668036 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c758d44dd-ll286"] Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.675065 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:49:23 crc kubenswrapper[4925]: W0226 08:49:23.686075 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0451d31d_3c1b_4a85_8437_5c40000f3792.slice/crio-df60398bfbb26c374455ff86612ca627207f3acf3bbf3bde95982d31bc9cbb90 WatchSource:0}: Error finding container df60398bfbb26c374455ff86612ca627207f3acf3bbf3bde95982d31bc9cbb90: Status 404 returned error can't find the container with id df60398bfbb26c374455ff86612ca627207f3acf3bbf3bde95982d31bc9cbb90 Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.715036 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.719593 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-x9xkw" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.730154 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xgq9g" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.736907 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:49:23 crc kubenswrapper[4925]: I0226 08:49:23.943386 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz"] Feb 26 08:49:24 crc kubenswrapper[4925]: I0226 08:49:24.513741 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad9fa6d-dc45-4aef-8380-5aa98401b840" path="/var/lib/kubelet/pods/bad9fa6d-dc45-4aef-8380-5aa98401b840/volumes" Feb 26 08:49:24 crc kubenswrapper[4925]: I0226 08:49:24.514576 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2a243e2-ace9-4720-af01-e657bc8fb8c8" path="/var/lib/kubelet/pods/d2a243e2-ace9-4720-af01-e657bc8fb8c8/volumes" Feb 26 08:49:24 crc kubenswrapper[4925]: I0226 08:49:24.664022 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" event={"ID":"0451d31d-3c1b-4a85-8437-5c40000f3792","Type":"ContainerStarted","Data":"7b7a780beee7aae91023a9d2719ed5160d4675ff5786ff7321f3ed5a63cffc82"} Feb 26 08:49:24 crc kubenswrapper[4925]: I0226 08:49:24.664065 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" event={"ID":"0451d31d-3c1b-4a85-8437-5c40000f3792","Type":"ContainerStarted","Data":"df60398bfbb26c374455ff86612ca627207f3acf3bbf3bde95982d31bc9cbb90"} Feb 26 08:49:24 crc kubenswrapper[4925]: I0226 08:49:24.665062 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:24 crc kubenswrapper[4925]: I0226 08:49:24.666728 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" event={"ID":"95a83bad-f937-4350-91c3-f37000e465f8","Type":"ContainerStarted","Data":"88c7053ef7b6ff85d51364cd4bbf8357c201eebaab5f6117a694f8fd94bb1714"} Feb 26 08:49:24 crc kubenswrapper[4925]: I0226 08:49:24.666760 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" event={"ID":"95a83bad-f937-4350-91c3-f37000e465f8","Type":"ContainerStarted","Data":"902c1c49e48798423ee6db1af0bdcc8dea0a227bcf1caeb5626db931cb682c1e"} Feb 26 08:49:24 crc kubenswrapper[4925]: I0226 08:49:24.668835 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" Feb 26 08:49:24 crc kubenswrapper[4925]: I0226 08:49:24.682030 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c758d44dd-ll286" podStartSLOduration=3.6820128949999997 podStartE2EDuration="3.682012895s" podCreationTimestamp="2026-02-26 08:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:49:24.679848632 +0000 UTC m=+296.819422925" watchObservedRunningTime="2026-02-26 08:49:24.682012895 +0000 UTC m=+296.821587188" Feb 26 08:49:24 crc kubenswrapper[4925]: I0226 08:49:24.723498 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" podStartSLOduration=3.723480123 podStartE2EDuration="3.723480123s" podCreationTimestamp="2026-02-26 08:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:49:24.720251804 +0000 UTC m=+296.859826107" watchObservedRunningTime="2026-02-26 08:49:24.723480123 +0000 UTC m=+296.863054406" Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.101432 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.101849 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.147635 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.533933 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.534317 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.573746 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gdzrh"] Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.576030 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.672237 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gdzrh" podUID="78d95f85-01b1-4a6b-8c48-b477f840035d" containerName="registry-server" containerID="cri-o://d451798681893aeb29c419ec492cd519b02b9b4b7be97f5101faca564df8f70a" gracePeriod=2 Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.673421 4925 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.679484 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-74c5966c96-94dqz" Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.728919 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mrq8g" Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.730589 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.779412 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c8g65"] Feb 26 08:49:25 crc kubenswrapper[4925]: I0226 08:49:25.779681 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c8g65" podUID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" containerName="registry-server" containerID="cri-o://50cb856ad17a717d1c908ab7d22a6429ce6ae6bd07ed3e2c2fc82feb4e023258" gracePeriod=2 Feb 26 08:49:26 crc kubenswrapper[4925]: I0226 08:49:26.685448 4925 generic.go:334] "Generic (PLEG): container finished" podID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" containerID="50cb856ad17a717d1c908ab7d22a6429ce6ae6bd07ed3e2c2fc82feb4e023258" exitCode=0 Feb 26 08:49:26 crc kubenswrapper[4925]: I0226 08:49:26.685563 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8g65" event={"ID":"cf852adc-ded1-43ba-9ed4-cf31d6eda903","Type":"ContainerDied","Data":"50cb856ad17a717d1c908ab7d22a6429ce6ae6bd07ed3e2c2fc82feb4e023258"} Feb 26 08:49:26 crc kubenswrapper[4925]: I0226 08:49:26.688115 4925 generic.go:334] "Generic (PLEG): container finished" podID="78d95f85-01b1-4a6b-8c48-b477f840035d" 
containerID="d451798681893aeb29c419ec492cd519b02b9b4b7be97f5101faca564df8f70a" exitCode=0 Feb 26 08:49:26 crc kubenswrapper[4925]: I0226 08:49:26.688136 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdzrh" event={"ID":"78d95f85-01b1-4a6b-8c48-b477f840035d","Type":"ContainerDied","Data":"d451798681893aeb29c419ec492cd519b02b9b4b7be97f5101faca564df8f70a"} Feb 26 08:49:26 crc kubenswrapper[4925]: I0226 08:49:26.900050 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:49:26 crc kubenswrapper[4925]: I0226 08:49:26.976110 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.013617 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n72qr\" (UniqueName: \"kubernetes.io/projected/cf852adc-ded1-43ba-9ed4-cf31d6eda903-kube-api-access-n72qr\") pod \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\" (UID: \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\") " Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.013906 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf852adc-ded1-43ba-9ed4-cf31d6eda903-catalog-content\") pod \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\" (UID: \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\") " Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.014036 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf852adc-ded1-43ba-9ed4-cf31d6eda903-utilities\") pod \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\" (UID: \"cf852adc-ded1-43ba-9ed4-cf31d6eda903\") " Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.014658 4925 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/cf852adc-ded1-43ba-9ed4-cf31d6eda903-utilities" (OuterVolumeSpecName: "utilities") pod "cf852adc-ded1-43ba-9ed4-cf31d6eda903" (UID: "cf852adc-ded1-43ba-9ed4-cf31d6eda903"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.015018 4925 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf852adc-ded1-43ba-9ed4-cf31d6eda903-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.028513 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf852adc-ded1-43ba-9ed4-cf31d6eda903-kube-api-access-n72qr" (OuterVolumeSpecName: "kube-api-access-n72qr") pod "cf852adc-ded1-43ba-9ed4-cf31d6eda903" (UID: "cf852adc-ded1-43ba-9ed4-cf31d6eda903"). InnerVolumeSpecName "kube-api-access-n72qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.076167 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf852adc-ded1-43ba-9ed4-cf31d6eda903-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf852adc-ded1-43ba-9ed4-cf31d6eda903" (UID: "cf852adc-ded1-43ba-9ed4-cf31d6eda903"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.116072 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78d95f85-01b1-4a6b-8c48-b477f840035d-catalog-content\") pod \"78d95f85-01b1-4a6b-8c48-b477f840035d\" (UID: \"78d95f85-01b1-4a6b-8c48-b477f840035d\") " Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.116231 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj2zh\" (UniqueName: \"kubernetes.io/projected/78d95f85-01b1-4a6b-8c48-b477f840035d-kube-api-access-cj2zh\") pod \"78d95f85-01b1-4a6b-8c48-b477f840035d\" (UID: \"78d95f85-01b1-4a6b-8c48-b477f840035d\") " Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.116388 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78d95f85-01b1-4a6b-8c48-b477f840035d-utilities\") pod \"78d95f85-01b1-4a6b-8c48-b477f840035d\" (UID: \"78d95f85-01b1-4a6b-8c48-b477f840035d\") " Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.116759 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n72qr\" (UniqueName: \"kubernetes.io/projected/cf852adc-ded1-43ba-9ed4-cf31d6eda903-kube-api-access-n72qr\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.116790 4925 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf852adc-ded1-43ba-9ed4-cf31d6eda903-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.117785 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78d95f85-01b1-4a6b-8c48-b477f840035d-utilities" (OuterVolumeSpecName: "utilities") pod "78d95f85-01b1-4a6b-8c48-b477f840035d" (UID: 
"78d95f85-01b1-4a6b-8c48-b477f840035d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.122754 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d95f85-01b1-4a6b-8c48-b477f840035d-kube-api-access-cj2zh" (OuterVolumeSpecName: "kube-api-access-cj2zh") pod "78d95f85-01b1-4a6b-8c48-b477f840035d" (UID: "78d95f85-01b1-4a6b-8c48-b477f840035d"). InnerVolumeSpecName "kube-api-access-cj2zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.183585 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78d95f85-01b1-4a6b-8c48-b477f840035d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78d95f85-01b1-4a6b-8c48-b477f840035d" (UID: "78d95f85-01b1-4a6b-8c48-b477f840035d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.218144 4925 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78d95f85-01b1-4a6b-8c48-b477f840035d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.218183 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj2zh\" (UniqueName: \"kubernetes.io/projected/78d95f85-01b1-4a6b-8c48-b477f840035d-kube-api-access-cj2zh\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.218197 4925 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78d95f85-01b1-4a6b-8c48-b477f840035d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.696923 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-c8g65" event={"ID":"cf852adc-ded1-43ba-9ed4-cf31d6eda903","Type":"ContainerDied","Data":"dbb0929885aa353ae66af1253dd8a209b5885d1362a9fad13ad620654abe7141"} Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.696984 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8g65" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.697012 4925 scope.go:117] "RemoveContainer" containerID="50cb856ad17a717d1c908ab7d22a6429ce6ae6bd07ed3e2c2fc82feb4e023258" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.701716 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gdzrh" event={"ID":"78d95f85-01b1-4a6b-8c48-b477f840035d","Type":"ContainerDied","Data":"498efbb87d26ef94b618f671b5a5c7cafc2ebf24ea483183bb730ee7a3980603"} Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.701877 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gdzrh" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.719224 4925 scope.go:117] "RemoveContainer" containerID="5d4454d771c799ec5afdfc7236931be8006d5ef28b9c2f1cfea9e96b5c49e4bf" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.737916 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c8g65"] Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.741900 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c8g65"] Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.763060 4925 scope.go:117] "RemoveContainer" containerID="59a9b95829a6337e61ff88f289834a710d7a9f8d8df3e220e0b3574b15cdb996" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.783703 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gdzrh"] Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.789168 4925 scope.go:117] "RemoveContainer" containerID="d451798681893aeb29c419ec492cd519b02b9b4b7be97f5101faca564df8f70a" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.790017 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gdzrh"] Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.804865 4925 scope.go:117] "RemoveContainer" containerID="d5c5033d869531d932c4966861ba74c3154e78ca0a4108ad26b782187f151b02" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.819408 4925 scope.go:117] "RemoveContainer" containerID="e717369ee2e3b75e644cb9877ce06bc29cdde42e69226b6b196e632d71fdc7d5" Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.974487 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pkxc"] Feb 26 08:49:27 crc kubenswrapper[4925]: I0226 08:49:27.974765 4925 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-7pkxc" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" containerName="registry-server" containerID="cri-o://69ed512e9218b36df073edbe72da71baa2eec9a550b8d7dd3ed93879d034b9bc" gracePeriod=2 Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.360284 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" podUID="c637160e-e623-415a-b0e4-9277dee94439" containerName="oauth-openshift" containerID="cri-o://3eabaa6d5f3739ecb2b9f1bdb0150e374dd132c0efdaad7834205e78db9f8463" gracePeriod=15 Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.516775 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78d95f85-01b1-4a6b-8c48-b477f840035d" path="/var/lib/kubelet/pods/78d95f85-01b1-4a6b-8c48-b477f840035d/volumes" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.517822 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" path="/var/lib/kubelet/pods/cf852adc-ded1-43ba-9ed4-cf31d6eda903/volumes" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.547585 4925 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.547839 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" containerName="registry-server" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.547852 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" containerName="registry-server" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.547866 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d95f85-01b1-4a6b-8c48-b477f840035d" containerName="extract-content" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.547872 4925 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="78d95f85-01b1-4a6b-8c48-b477f840035d" containerName="extract-content" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.547886 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d95f85-01b1-4a6b-8c48-b477f840035d" containerName="extract-utilities" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.547892 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d95f85-01b1-4a6b-8c48-b477f840035d" containerName="extract-utilities" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.547905 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" containerName="extract-utilities" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.547910 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" containerName="extract-utilities" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.547918 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d95f85-01b1-4a6b-8c48-b477f840035d" containerName="registry-server" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.547926 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d95f85-01b1-4a6b-8c48-b477f840035d" containerName="registry-server" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.547935 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" containerName="extract-content" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.547942 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" containerName="extract-content" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.548034 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf852adc-ded1-43ba-9ed4-cf31d6eda903" containerName="registry-server" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.548045 4925 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="78d95f85-01b1-4a6b-8c48-b477f840035d" containerName="registry-server" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.548515 4925 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.548664 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.548792 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4" gracePeriod=15 Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.549270 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460" gracePeriod=15 Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.549311 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095" gracePeriod=15 Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.549348 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e" gracePeriod=15 Feb 26 08:49:28 
crc kubenswrapper[4925]: I0226 08:49:28.549379 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18" gracePeriod=15 Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.550068 4925 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.550448 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.550474 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.550486 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.550497 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.550515 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.550541 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.550552 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 08:49:28 crc kubenswrapper[4925]: 
I0226 08:49:28.550559 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.550571 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.550579 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.550589 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.550595 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.550608 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.550614 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.550623 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.550629 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.550642 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.550649 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.556173 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.556244 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.556259 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.556270 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.556291 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.556299 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.556324 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.556339 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 08:49:28 crc kubenswrapper[4925]: 
E0226 08:49:28.562627 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.562656 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.563170 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.609237 4925 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.637073 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.637186 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.637227 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.637284 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.637340 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.659319 4925 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.659590 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.713889 4925 generic.go:334] "Generic (PLEG): container finished" podID="c637160e-e623-415a-b0e4-9277dee94439" 
containerID="3eabaa6d5f3739ecb2b9f1bdb0150e374dd132c0efdaad7834205e78db9f8463" exitCode=0 Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.713958 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" event={"ID":"c637160e-e623-415a-b0e4-9277dee94439","Type":"ContainerDied","Data":"3eabaa6d5f3739ecb2b9f1bdb0150e374dd132c0efdaad7834205e78db9f8463"} Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.716013 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.717245 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.717900 4925 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460" exitCode=0 Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.717915 4925 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095" exitCode=0 Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.717941 4925 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e" exitCode=0 Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.717948 4925 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18" exitCode=2 Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.718021 4925 scope.go:117] 
"RemoveContainer" containerID="450b3e40b5931a115663540d1dc7eefac9d3f471dfd5d77a08bf5d18db5b2011" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.722582 4925 generic.go:334] "Generic (PLEG): container finished" podID="e99e124b-4917-49df-8c28-04373cfed9d3" containerID="69ed512e9218b36df073edbe72da71baa2eec9a550b8d7dd3ed93879d034b9bc" exitCode=0 Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.722628 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pkxc" event={"ID":"e99e124b-4917-49df-8c28-04373cfed9d3","Type":"ContainerDied","Data":"69ed512e9218b36df073edbe72da71baa2eec9a550b8d7dd3ed93879d034b9bc"} Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.738749 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.738904 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.738969 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.739039 4925 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.739041 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.739069 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.739127 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.739138 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.739193 4925 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.739222 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.739299 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.739390 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.739440 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.818360 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.818976 4925 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.819345 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.840044 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.840085 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.840105 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 
08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.840119 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.840179 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.840212 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.911307 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.911740 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.912154 4925 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.912371 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.912580 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:28 crc kubenswrapper[4925]: W0226 08:49:28.930431 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-129eb4a6299fa3f1929e749f669a36a26eecacd2de4b8e89e4c19865db1268ef WatchSource:0}: Error finding container 129eb4a6299fa3f1929e749f669a36a26eecacd2de4b8e89e4c19865db1268ef: Status 404 returned error can't find the container with id 129eb4a6299fa3f1929e749f669a36a26eecacd2de4b8e89e4c19865db1268ef Feb 26 08:49:28 crc kubenswrapper[4925]: E0226 08:49:28.933466 4925 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.136:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897bfb67f2edebe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:49:28.932310718 +0000 UTC m=+301.071885001,LastTimestamp:2026-02-26 08:49:28.932310718 +0000 UTC m=+301.071885001,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.941893 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-serving-cert\") pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.942068 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gcl4\" (UniqueName: \"kubernetes.io/projected/c637160e-e623-415a-b0e4-9277dee94439-kube-api-access-4gcl4\") pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.942122 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/c637160e-e623-415a-b0e4-9277dee94439-audit-dir\") pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.942171 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-login\") pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.942200 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-ocp-branding-template\") pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.942224 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-provider-selection\") pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.942240 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-session\") pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.942261 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-router-certs\") pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.942294 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-audit-policies\") pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.942345 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-cliconfig\") pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.942369 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-service-ca\") pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.942385 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-trusted-ca-bundle\") pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.942400 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-error\") 
pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.942416 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-idp-0-file-data\") pod \"c637160e-e623-415a-b0e4-9277dee94439\" (UID: \"c637160e-e623-415a-b0e4-9277dee94439\") " Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.943025 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c637160e-e623-415a-b0e4-9277dee94439-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.943472 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.943519 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.943975 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.945258 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.945729 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.946739 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.946831 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.946975 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.947600 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.947645 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.948353 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.950023 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 08:49:28 crc kubenswrapper[4925]: I0226 08:49:28.950804 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c637160e-e623-415a-b0e4-9277dee94439-kube-api-access-4gcl4" (OuterVolumeSpecName: "kube-api-access-4gcl4") pod "c637160e-e623-415a-b0e4-9277dee94439" (UID: "c637160e-e623-415a-b0e4-9277dee94439"). InnerVolumeSpecName "kube-api-access-4gcl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.043571 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2h9v\" (UniqueName: \"kubernetes.io/projected/e99e124b-4917-49df-8c28-04373cfed9d3-kube-api-access-n2h9v\") pod \"e99e124b-4917-49df-8c28-04373cfed9d3\" (UID: \"e99e124b-4917-49df-8c28-04373cfed9d3\") " Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.043655 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e99e124b-4917-49df-8c28-04373cfed9d3-catalog-content\") pod \"e99e124b-4917-49df-8c28-04373cfed9d3\" (UID: \"e99e124b-4917-49df-8c28-04373cfed9d3\") " Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.043678 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e99e124b-4917-49df-8c28-04373cfed9d3-utilities\") pod \"e99e124b-4917-49df-8c28-04373cfed9d3\" (UID: \"e99e124b-4917-49df-8c28-04373cfed9d3\") " Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.043979 4925 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.044002 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.044028 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 26 
08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.044042 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.044062 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.044091 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.044103 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.044116 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gcl4\" (UniqueName: \"kubernetes.io/projected/c637160e-e623-415a-b0e4-9277dee94439-kube-api-access-4gcl4\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.044128 4925 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c637160e-e623-415a-b0e4-9277dee94439-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.044138 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.044150 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.044162 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.044176 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.044189 4925 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c637160e-e623-415a-b0e4-9277dee94439-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.045541 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e99e124b-4917-49df-8c28-04373cfed9d3-utilities" (OuterVolumeSpecName: "utilities") pod "e99e124b-4917-49df-8c28-04373cfed9d3" (UID: "e99e124b-4917-49df-8c28-04373cfed9d3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.047863 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e99e124b-4917-49df-8c28-04373cfed9d3-kube-api-access-n2h9v" (OuterVolumeSpecName: "kube-api-access-n2h9v") pod "e99e124b-4917-49df-8c28-04373cfed9d3" (UID: "e99e124b-4917-49df-8c28-04373cfed9d3"). InnerVolumeSpecName "kube-api-access-n2h9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.067063 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e99e124b-4917-49df-8c28-04373cfed9d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e99e124b-4917-49df-8c28-04373cfed9d3" (UID: "e99e124b-4917-49df-8c28-04373cfed9d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.145084 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2h9v\" (UniqueName: \"kubernetes.io/projected/e99e124b-4917-49df-8c28-04373cfed9d3-kube-api-access-n2h9v\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.145127 4925 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e99e124b-4917-49df-8c28-04373cfed9d3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.145138 4925 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e99e124b-4917-49df-8c28-04373cfed9d3-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.728703 4925 generic.go:334] "Generic (PLEG): container finished" podID="a11f03c1-788a-4e80-aaf8-02dd3a150814" 
containerID="af642ad749fa65a4f2db241aaa17dfb27beb28397b85424ebf4848346715849b" exitCode=0 Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.728815 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a11f03c1-788a-4e80-aaf8-02dd3a150814","Type":"ContainerDied","Data":"af642ad749fa65a4f2db241aaa17dfb27beb28397b85424ebf4848346715849b"} Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.731594 4925 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.732268 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.732807 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.733072 4925 status_manager.go:851] "Failed to get status for pod" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 
08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.733381 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c3e875887d3713fbbe41e0d57043fbc94df80d6590717f0b11fd3477674162c8"} Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.733622 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"129eb4a6299fa3f1929e749f669a36a26eecacd2de4b8e89e4c19865db1268ef"} Feb 26 08:49:29 crc kubenswrapper[4925]: E0226 08:49:29.734127 4925 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.734563 4925 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.734965 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.735283 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" 
pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.735578 4925 status_manager.go:851] "Failed to get status for pod" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.737080 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7pkxc" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.737711 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7pkxc" event={"ID":"e99e124b-4917-49df-8c28-04373cfed9d3","Type":"ContainerDied","Data":"a4d60917cc86ec24cba0908eaaa8c99c33bd776600af08cec7c88cf007a268cd"} Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.737844 4925 scope.go:117] "RemoveContainer" containerID="69ed512e9218b36df073edbe72da71baa2eec9a550b8d7dd3ed93879d034b9bc" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.738375 4925 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.738678 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.738954 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.739187 4925 status_manager.go:851] "Failed to get status for pod" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.741803 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.744579 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" event={"ID":"c637160e-e623-415a-b0e4-9277dee94439","Type":"ContainerDied","Data":"956ea3fd7df4dad1d7ec15e6538fd0c1d13feade21f6f9a3a84aca649e20fed7"} Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.744679 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.745302 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.745728 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.746327 4925 status_manager.go:851] "Failed to get status for pod" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.746601 4925 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.754421 4925 scope.go:117] "RemoveContainer" containerID="306e49a357fd31d97ca234ad4c1f297628844dad259f2ae2e16b99759749d51b" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.757045 4925 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.757554 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.757848 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.758200 4925 status_manager.go:851] "Failed to get status for pod" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.770381 4925 scope.go:117] "RemoveContainer" containerID="af9d0b4d89aa903b43b9de531519739fcc789d4e9c864117ec46bca03a24e8bd" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.783131 4925 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: 
connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.783352 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.783550 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.783709 4925 status_manager.go:851] "Failed to get status for pod" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:29 crc kubenswrapper[4925]: I0226 08:49:29.808508 4925 scope.go:117] "RemoveContainer" containerID="3eabaa6d5f3739ecb2b9f1bdb0150e374dd132c0efdaad7834205e78db9f8463" Feb 26 08:49:30 crc kubenswrapper[4925]: E0226 08:49:30.766094 4925 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:49:30 crc kubenswrapper[4925]: W0226 08:49:30.799066 4925 helpers.go:245] readString: Failed to read 
"/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-conmon-617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4.scope/cpu.max": read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-conmon-617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4.scope/cpu.max: no such device Feb 26 08:49:30 crc kubenswrapper[4925]: E0226 08:49:30.837594 4925 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4.scope\": RecentStats: unable to find data in memory cache]" Feb 26 08:49:30 crc kubenswrapper[4925]: I0226 08:49:30.937250 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 08:49:30 crc kubenswrapper[4925]: I0226 08:49:30.937920 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:30 crc kubenswrapper[4925]: I0226 08:49:30.938815 4925 status_manager.go:851] "Failed to get status for pod" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:30 crc kubenswrapper[4925]: I0226 08:49:30.938995 4925 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:30 crc kubenswrapper[4925]: I0226 08:49:30.939178 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:30 crc kubenswrapper[4925]: I0226 08:49:30.939414 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.073750 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 
08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.073848 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.073917 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.073929 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.073973 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.074008 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.074341 4925 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.074369 4925 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.074389 4925 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.122000 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.122766 4925 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.123252 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.123893 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" 
pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.124667 4925 status_manager.go:851] "Failed to get status for pod" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.277335 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a11f03c1-788a-4e80-aaf8-02dd3a150814-kube-api-access\") pod \"a11f03c1-788a-4e80-aaf8-02dd3a150814\" (UID: \"a11f03c1-788a-4e80-aaf8-02dd3a150814\") " Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.277464 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a11f03c1-788a-4e80-aaf8-02dd3a150814-kubelet-dir\") pod \"a11f03c1-788a-4e80-aaf8-02dd3a150814\" (UID: \"a11f03c1-788a-4e80-aaf8-02dd3a150814\") " Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.277574 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a11f03c1-788a-4e80-aaf8-02dd3a150814-var-lock\") pod \"a11f03c1-788a-4e80-aaf8-02dd3a150814\" (UID: \"a11f03c1-788a-4e80-aaf8-02dd3a150814\") " Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.277657 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a11f03c1-788a-4e80-aaf8-02dd3a150814-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a11f03c1-788a-4e80-aaf8-02dd3a150814" (UID: "a11f03c1-788a-4e80-aaf8-02dd3a150814"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.277766 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a11f03c1-788a-4e80-aaf8-02dd3a150814-var-lock" (OuterVolumeSpecName: "var-lock") pod "a11f03c1-788a-4e80-aaf8-02dd3a150814" (UID: "a11f03c1-788a-4e80-aaf8-02dd3a150814"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.278211 4925 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a11f03c1-788a-4e80-aaf8-02dd3a150814-var-lock\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.278238 4925 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a11f03c1-788a-4e80-aaf8-02dd3a150814-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.284675 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a11f03c1-788a-4e80-aaf8-02dd3a150814-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a11f03c1-788a-4e80-aaf8-02dd3a150814" (UID: "a11f03c1-788a-4e80-aaf8-02dd3a150814"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.379965 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a11f03c1-788a-4e80-aaf8-02dd3a150814-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.771149 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.772295 4925 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4" exitCode=0 Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.772373 4925 scope.go:117] "RemoveContainer" containerID="9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.772574 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.773909 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a11f03c1-788a-4e80-aaf8-02dd3a150814","Type":"ContainerDied","Data":"09b5e059ece7f456508de29920466d133bd4de0e37ddec0ee3fa4ee203320e66"} Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.773937 4925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09b5e059ece7f456508de29920466d133bd4de0e37ddec0ee3fa4ee203320e66" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.774155 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.788399 4925 scope.go:117] "RemoveContainer" containerID="e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.791078 4925 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.791348 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.791752 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.792082 4925 status_manager.go:851] "Failed to get status for pod" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.797751 4925 status_manager.go:851] "Failed to get status for pod" 
podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.798077 4925 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.798377 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.798742 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.803476 4925 scope.go:117] "RemoveContainer" containerID="f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.817617 4925 scope.go:117] "RemoveContainer" containerID="cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.832481 4925 scope.go:117] "RemoveContainer" containerID="617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4" 
Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.849235 4925 scope.go:117] "RemoveContainer" containerID="237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.867281 4925 scope.go:117] "RemoveContainer" containerID="9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460" Feb 26 08:49:31 crc kubenswrapper[4925]: E0226 08:49:31.867866 4925 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460\": container with ID starting with 9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460 not found: ID does not exist" containerID="9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.867954 4925 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460"} err="failed to get container status \"9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460\": rpc error: code = NotFound desc = could not find container \"9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460\": container with ID starting with 9bd4a14b45912a61145d72aaa6b5e181253b68ba75d012acaa27caa7f4cf7460 not found: ID does not exist" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.868022 4925 scope.go:117] "RemoveContainer" containerID="e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095" Feb 26 08:49:31 crc kubenswrapper[4925]: E0226 08:49:31.868320 4925 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\": container with ID starting with e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095 not found: ID does not exist" 
containerID="e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.868349 4925 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095"} err="failed to get container status \"e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\": rpc error: code = NotFound desc = could not find container \"e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095\": container with ID starting with e165d278bfd47a790227d167668434628573cabcff13b31eb9675727e8fb1095 not found: ID does not exist" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.868367 4925 scope.go:117] "RemoveContainer" containerID="f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e" Feb 26 08:49:31 crc kubenswrapper[4925]: E0226 08:49:31.868680 4925 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\": container with ID starting with f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e not found: ID does not exist" containerID="f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.868723 4925 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e"} err="failed to get container status \"f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\": rpc error: code = NotFound desc = could not find container \"f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e\": container with ID starting with f0e47f2144efa42ac09e887260db5dc383881d218c2252ceea241f624f32ef5e not found: ID does not exist" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.868749 4925 scope.go:117] 
"RemoveContainer" containerID="cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18" Feb 26 08:49:31 crc kubenswrapper[4925]: E0226 08:49:31.870073 4925 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\": container with ID starting with cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18 not found: ID does not exist" containerID="cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.870120 4925 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18"} err="failed to get container status \"cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\": rpc error: code = NotFound desc = could not find container \"cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18\": container with ID starting with cf9af5a3eba34d7d855f4c479174aab76bcd7c5a55ec58a8b4fc7375192b1a18 not found: ID does not exist" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.870199 4925 scope.go:117] "RemoveContainer" containerID="617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4" Feb 26 08:49:31 crc kubenswrapper[4925]: E0226 08:49:31.870941 4925 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\": container with ID starting with 617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4 not found: ID does not exist" containerID="617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.870978 4925 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4"} err="failed to get container status \"617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\": rpc error: code = NotFound desc = could not find container \"617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4\": container with ID starting with 617eee9911dfe622e33dab8a621792aa6621ddcb38ad0989fb9270327f5ae6f4 not found: ID does not exist" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.870999 4925 scope.go:117] "RemoveContainer" containerID="237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01" Feb 26 08:49:31 crc kubenswrapper[4925]: E0226 08:49:31.871332 4925 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\": container with ID starting with 237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01 not found: ID does not exist" containerID="237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01" Feb 26 08:49:31 crc kubenswrapper[4925]: I0226 08:49:31.871361 4925 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01"} err="failed to get container status \"237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\": rpc error: code = NotFound desc = could not find container \"237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01\": container with ID starting with 237c71358672a61d2ec5f02a71efc3501552ba01f6998fa8098098ff10526a01 not found: ID does not exist" Feb 26 08:49:32 crc kubenswrapper[4925]: I0226 08:49:32.516845 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 26 08:49:33 crc kubenswrapper[4925]: E0226 
08:49:33.150239 4925 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:33 crc kubenswrapper[4925]: E0226 08:49:33.150808 4925 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:33 crc kubenswrapper[4925]: E0226 08:49:33.151915 4925 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:33 crc kubenswrapper[4925]: E0226 08:49:33.152362 4925 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:33 crc kubenswrapper[4925]: E0226 08:49:33.153130 4925 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:33 crc kubenswrapper[4925]: I0226 08:49:33.153231 4925 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 26 08:49:33 crc kubenswrapper[4925]: E0226 08:49:33.153744 4925 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="200ms" Feb 26 08:49:33 crc kubenswrapper[4925]: 
E0226 08:49:33.354748 4925 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="400ms" Feb 26 08:49:33 crc kubenswrapper[4925]: E0226 08:49:33.756847 4925 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="800ms" Feb 26 08:49:33 crc kubenswrapper[4925]: E0226 08:49:33.850026 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:49:33Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:49:33Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:49:33Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:49:33Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:33 crc kubenswrapper[4925]: E0226 08:49:33.850405 4925 kubelet_node_status.go:585] "Error updating node 
status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:33 crc kubenswrapper[4925]: E0226 08:49:33.850845 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:33 crc kubenswrapper[4925]: E0226 08:49:33.851112 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:33 crc kubenswrapper[4925]: E0226 08:49:33.851357 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:33 crc kubenswrapper[4925]: E0226 08:49:33.851381 4925 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 08:49:34 crc kubenswrapper[4925]: E0226 08:49:34.557847 4925 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="1.6s" Feb 26 08:49:36 crc kubenswrapper[4925]: E0226 08:49:36.160120 4925 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="3.2s" Feb 26 08:49:38 crc kubenswrapper[4925]: E0226 08:49:38.393126 4925 event.go:368] 
"Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.136:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897bfb67f2edebe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 08:49:28.932310718 +0000 UTC m=+301.071885001,LastTimestamp:2026-02-26 08:49:28.932310718 +0000 UTC m=+301.071885001,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 08:49:38 crc kubenswrapper[4925]: I0226 08:49:38.511498 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:38 crc kubenswrapper[4925]: I0226 08:49:38.512148 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:38 crc kubenswrapper[4925]: I0226 
08:49:38.512639 4925 status_manager.go:851] "Failed to get status for pod" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:39 crc kubenswrapper[4925]: E0226 08:49:39.361047 4925 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="6.4s" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.507181 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.508406 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.508797 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.509359 4925 status_manager.go:851] "Failed to get status for pod" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial 
tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.520658 4925 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8fe9a195-8e21-4517-b680-da420bb30a5e" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.520721 4925 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8fe9a195-8e21-4517-b680-da420bb30a5e" Feb 26 08:49:43 crc kubenswrapper[4925]: E0226 08:49:43.521510 4925 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.522043 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:43 crc kubenswrapper[4925]: W0226 08:49:43.539928 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-1380da7a9dbccaa9292fc9852bb4aa25c72cf900b9d95fef3f0da10eef11d795 WatchSource:0}: Error finding container 1380da7a9dbccaa9292fc9852bb4aa25c72cf900b9d95fef3f0da10eef11d795: Status 404 returned error can't find the container with id 1380da7a9dbccaa9292fc9852bb4aa25c72cf900b9d95fef3f0da10eef11d795 Feb 26 08:49:43 crc kubenswrapper[4925]: E0226 08:49:43.562802 4925 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.136:6443: connect: connection refused" 
pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" volumeName="registry-storage" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.707194 4925 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.707723 4925 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.854807 4925 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c67940b5f3acaee7802fac3b0e5a75cde7512b9853965560bd499aecb68059c3" exitCode=0 Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.854886 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c67940b5f3acaee7802fac3b0e5a75cde7512b9853965560bd499aecb68059c3"} Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.854935 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1380da7a9dbccaa9292fc9852bb4aa25c72cf900b9d95fef3f0da10eef11d795"} Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.855337 4925 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8fe9a195-8e21-4517-b680-da420bb30a5e" Feb 26 08:49:43 crc kubenswrapper[4925]: 
I0226 08:49:43.855375 4925 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8fe9a195-8e21-4517-b680-da420bb30a5e" Feb 26 08:49:43 crc kubenswrapper[4925]: E0226 08:49:43.855863 4925 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.855860 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.856168 4925 status_manager.go:851] "Failed to get status for pod" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.856382 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.858926 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 08:49:43 crc 
kubenswrapper[4925]: I0226 08:49:43.859749 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.859804 4925 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="044a78201ffbc5765ab36a83077c152e1d0f0d1d05cd6407aab6851a06e3681b" exitCode=1 Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.859835 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"044a78201ffbc5765ab36a83077c152e1d0f0d1d05cd6407aab6851a06e3681b"} Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.860280 4925 scope.go:117] "RemoveContainer" containerID="044a78201ffbc5765ab36a83077c152e1d0f0d1d05cd6407aab6851a06e3681b" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.860639 4925 status_manager.go:851] "Failed to get status for pod" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" pod="openshift-marketplace/redhat-marketplace-7pkxc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-7pkxc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.861089 4925 status_manager.go:851] "Failed to get status for pod" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.861318 4925 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:43 crc kubenswrapper[4925]: I0226 08:49:43.861502 4925 status_manager.go:851] "Failed to get status for pod" podUID="c637160e-e623-415a-b0e4-9277dee94439" pod="openshift-authentication/oauth-openshift-558db77b4-btcxz" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-btcxz\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:44 crc kubenswrapper[4925]: E0226 08:49:44.219320 4925 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:49:44Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:49:44Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:49:44Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T08:49:44Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" Feb 26 08:49:44 crc kubenswrapper[4925]: I0226 08:49:44.876893 4925 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"496dfd275e22c5a5e86938d6bbc32475a5c6a041741315fba2bc9d213c82b6e1"} Feb 26 08:49:44 crc kubenswrapper[4925]: I0226 08:49:44.876977 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"481a4c7ff95e85b8b8ba2df332949fa467a9bbb4c273d2bc494bf027b7a19e49"} Feb 26 08:49:44 crc kubenswrapper[4925]: I0226 08:49:44.876991 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6d5ed1b6a0023f8acaf88fffa82807eab07bafa0b4b7833c300df2bd0480f731"} Feb 26 08:49:44 crc kubenswrapper[4925]: I0226 08:49:44.877024 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6d4bc48353ed2130a4dfb1d9aaf6c26bb6258af71ba8ea456b3ff013dd0c2ac4"} Feb 26 08:49:44 crc kubenswrapper[4925]: I0226 08:49:44.887994 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 08:49:44 crc kubenswrapper[4925]: I0226 08:49:44.888740 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 26 08:49:44 crc kubenswrapper[4925]: I0226 08:49:44.888808 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0c9a42a718f2c3bb2163e897c3c5490f2d405220929480b928796e106195a986"} Feb 26 08:49:45 crc kubenswrapper[4925]: I0226 08:49:45.895982 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e8b766ff1e2b7b97fe5ce9b5bf8a9a5ee72c583be8711d0dacc1bec35515f66d"} Feb 26 08:49:45 crc kubenswrapper[4925]: I0226 08:49:45.896354 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:45 crc kubenswrapper[4925]: I0226 08:49:45.896426 4925 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8fe9a195-8e21-4517-b680-da420bb30a5e" Feb 26 08:49:45 crc kubenswrapper[4925]: I0226 08:49:45.896450 4925 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8fe9a195-8e21-4517-b680-da420bb30a5e" Feb 26 08:49:46 crc kubenswrapper[4925]: I0226 08:49:46.269131 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 08:49:46 crc kubenswrapper[4925]: I0226 08:49:46.272741 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 08:49:46 crc kubenswrapper[4925]: I0226 08:49:46.902424 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 08:49:48 crc kubenswrapper[4925]: I0226 08:49:48.522227 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:48 crc kubenswrapper[4925]: I0226 08:49:48.522269 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:48 crc kubenswrapper[4925]: I0226 08:49:48.527740 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:50 crc kubenswrapper[4925]: I0226 08:49:50.904327 4925 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:50 crc kubenswrapper[4925]: I0226 08:49:50.920143 4925 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8fe9a195-8e21-4517-b680-da420bb30a5e" Feb 26 08:49:50 crc kubenswrapper[4925]: I0226 08:49:50.920175 4925 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8fe9a195-8e21-4517-b680-da420bb30a5e" Feb 26 08:49:50 crc kubenswrapper[4925]: I0226 08:49:50.923603 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 08:49:50 crc kubenswrapper[4925]: I0226 08:49:50.925575 4925 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ecab3f38-62bd-4eee-ac8d-3617909b7563" Feb 26 08:49:51 crc kubenswrapper[4925]: I0226 08:49:51.925822 4925 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8fe9a195-8e21-4517-b680-da420bb30a5e" Feb 26 08:49:51 crc kubenswrapper[4925]: I0226 08:49:51.926174 4925 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8fe9a195-8e21-4517-b680-da420bb30a5e" Feb 26 08:49:58 crc kubenswrapper[4925]: I0226 08:49:58.515857 4925 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" 
podUID="ecab3f38-62bd-4eee-ac8d-3617909b7563" Feb 26 08:50:00 crc kubenswrapper[4925]: I0226 08:50:00.707050 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 26 08:50:01 crc kubenswrapper[4925]: I0226 08:50:01.132491 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 26 08:50:01 crc kubenswrapper[4925]: I0226 08:50:01.638834 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 08:50:01 crc kubenswrapper[4925]: I0226 08:50:01.907515 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 26 08:50:01 crc kubenswrapper[4925]: I0226 08:50:01.924058 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 26 08:50:02 crc kubenswrapper[4925]: I0226 08:50:02.390898 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 26 08:50:02 crc kubenswrapper[4925]: I0226 08:50:02.505930 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 08:50:02 crc kubenswrapper[4925]: I0226 08:50:02.588464 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 26 08:50:02 crc kubenswrapper[4925]: I0226 08:50:02.593521 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 26 08:50:02 crc kubenswrapper[4925]: I0226 08:50:02.649792 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 08:50:02 crc 
kubenswrapper[4925]: I0226 08:50:02.694483 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.029140 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.048312 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.107362 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.191633 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.207599 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.487221 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.712111 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.721580 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.721580 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.833042 4925 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"kube-root-ca.crt" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.836249 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.854480 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.891069 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.933266 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 26 08:50:03 crc kubenswrapper[4925]: I0226 08:50:03.965489 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.034256 4925 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.115073 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.245748 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.313763 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.352213 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.446789 4925 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.448384 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.449828 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.528915 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.560433 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.628481 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.647553 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.733157 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.759921 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.854099 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 26 08:50:04 crc kubenswrapper[4925]: I0226 08:50:04.883966 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 26 08:50:05 crc 
kubenswrapper[4925]: I0226 08:50:05.006383 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.012306 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.040013 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.071847 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.074999 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.151124 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.427158 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.502064 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.578297 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.594631 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.605705 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.632548 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.670368 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.727795 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.731990 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.798891 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.816740 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.826940 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.842065 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.867561 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.882657 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.909788 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.919466 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.919659 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.920816 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.930229 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 26 08:50:05 crc kubenswrapper[4925]: I0226 08:50:05.945465 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.017938 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.045473 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.046151 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.173493 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.176267 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.188172 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.198443 4925 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.273876 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.306934 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.413598 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.431648 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.461555 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.677131 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.684892 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.731340 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.805990 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.809653 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.823475 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.857455 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.865492 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.944289 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.959923 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 26 08:50:06 crc kubenswrapper[4925]: I0226 08:50:06.969152 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.006442 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.046318 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.150292 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.320931 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.401520 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.571312 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.595944 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.764630 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.780561 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.801289 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.803507 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.831552 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.973054 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 26 08:50:07 crc kubenswrapper[4925]: I0226 08:50:07.998359 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.042549 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.056718 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.065200 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.146773 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.202393 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.236670 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.296009 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.321241 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.373499 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.377968 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.389580 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.452118 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.550223 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.691840 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.745218 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.778098 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 26 08:50:08 crc kubenswrapper[4925]: I0226 08:50:08.887732 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.040029 4925 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.067319 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.237872 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.254333 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.348173 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.470677 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.559049 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.601912 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.675444 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.684004 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.711160 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.739598 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.919188 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.927809 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 26 08:50:09 crc kubenswrapper[4925]: I0226 08:50:09.937007 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.086305 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.144918 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.162046 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.172074 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.189172 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.200788 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.202128 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.248918 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.369394 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.461123 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.473263 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.502638 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.559370 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.568283 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.602682 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.772126 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 26 08:50:10 crc kubenswrapper[4925]: I0226 08:50:10.872952 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.075853 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.139293 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.151453 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.355146 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.444000 4925 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.504687 4925 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.520680 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.562773 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.667703 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.760971 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.790138 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.805106 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.810339 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.884604 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.915164 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.940182 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.949323 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 26 08:50:11 crc kubenswrapper[4925]: I0226 08:50:11.953863 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.081922 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.128124 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.130200 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.158221 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.206903 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.300415 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.321893 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.362722 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.410172 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.475248 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.480768 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.536165 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.547491 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.660819 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.689973 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.697438 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.701304 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.759027 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.772941 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.794950 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 26 08:50:12 crc kubenswrapper[4925]: I0226 08:50:12.804910 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.059864 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.074963 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.094419 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.098078 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.461815 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.476493 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.610604 4925 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.615193 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7pkxc","openshift-authentication/oauth-openshift-558db77b4-btcxz","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.615256 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-infra/auto-csr-approver-29534930-h9hwr"]
Feb 26 08:50:13 crc kubenswrapper[4925]: E0226 08:50:13.615456 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c637160e-e623-415a-b0e4-9277dee94439" containerName="oauth-openshift"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.615474 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="c637160e-e623-415a-b0e4-9277dee94439" containerName="oauth-openshift"
Feb 26 08:50:13 crc kubenswrapper[4925]: E0226 08:50:13.615487 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" containerName="extract-content"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.615496 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" containerName="extract-content"
Feb 26 08:50:13 crc kubenswrapper[4925]: E0226 08:50:13.615506 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" containerName="registry-server"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.615513 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" containerName="registry-server"
Feb 26 08:50:13 crc kubenswrapper[4925]: E0226 08:50:13.615542 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" containerName="extract-utilities"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.615551 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" containerName="extract-utilities"
Feb 26 08:50:13 crc kubenswrapper[4925]: E0226 08:50:13.615562 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" containerName="installer"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.615569 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" containerName="installer"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.615677 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="a11f03c1-788a-4e80-aaf8-02dd3a150814" containerName="installer"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.615690 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" containerName="registry-server"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.615701 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="c637160e-e623-415a-b0e4-9277dee94439" containerName="oauth-openshift"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.616099 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534930-h9hwr"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.618192 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.618413 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.618890 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zdfj4"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.623334 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.635853 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.635835968 podStartE2EDuration="23.635835968s" podCreationTimestamp="2026-02-26 08:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:50:13.632360161 +0000 UTC m=+345.771934444" watchObservedRunningTime="2026-02-26 08:50:13.635835968 +0000 UTC m=+345.775410251"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.698639 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrhwm\" (UniqueName: \"kubernetes.io/projected/2ed05424-f6d1-49dc-a527-8a32a355c6e1-kube-api-access-lrhwm\") pod \"auto-csr-approver-29534930-h9hwr\" (UID: \"2ed05424-f6d1-49dc-a527-8a32a355c6e1\") " pod="openshift-infra/auto-csr-approver-29534930-h9hwr"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.761988 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.800140 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrhwm\" (UniqueName: \"kubernetes.io/projected/2ed05424-f6d1-49dc-a527-8a32a355c6e1-kube-api-access-lrhwm\") pod \"auto-csr-approver-29534930-h9hwr\" (UID: \"2ed05424-f6d1-49dc-a527-8a32a355c6e1\") " pod="openshift-infra/auto-csr-approver-29534930-h9hwr"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.823196 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrhwm\" (UniqueName: \"kubernetes.io/projected/2ed05424-f6d1-49dc-a527-8a32a355c6e1-kube-api-access-lrhwm\") pod \"auto-csr-approver-29534930-h9hwr\" (UID: \"2ed05424-f6d1-49dc-a527-8a32a355c6e1\") " pod="openshift-infra/auto-csr-approver-29534930-h9hwr"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.842253 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.848409 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 26 08:50:13 crc kubenswrapper[4925]: I0226 08:50:13.931702 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534930-h9hwr"
Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.030409 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.099080 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.119826 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.250358 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.271225 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.274998 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.309506 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29534930-h9hwr"]
Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.403561 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.422764 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.512854 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c637160e-e623-415a-b0e4-9277dee94439" path="/var/lib/kubelet/pods/c637160e-e623-415a-b0e4-9277dee94439/volumes"
Feb 26
08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.513631 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e99e124b-4917-49df-8c28-04373cfed9d3" path="/var/lib/kubelet/pods/e99e124b-4917-49df-8c28-04373cfed9d3/volumes" Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.520665 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.535861 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.661826 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.716310 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.768669 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 26 08:50:14 crc kubenswrapper[4925]: I0226 08:50:14.818065 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 26 08:50:15 crc kubenswrapper[4925]: I0226 08:50:15.057755 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534930-h9hwr" event={"ID":"2ed05424-f6d1-49dc-a527-8a32a355c6e1","Type":"ContainerStarted","Data":"7fe4bd0b489fcc3f0b057b07e8abcf7909bb2e436eee680d5d9511ddebd941fe"} Feb 26 08:50:15 crc kubenswrapper[4925]: I0226 08:50:15.239148 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 26 08:50:15 crc kubenswrapper[4925]: I0226 08:50:15.252030 4925 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"machine-api-operator-tls" Feb 26 08:50:15 crc kubenswrapper[4925]: I0226 08:50:15.281491 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 26 08:50:15 crc kubenswrapper[4925]: I0226 08:50:15.377943 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 26 08:50:15 crc kubenswrapper[4925]: I0226 08:50:15.486520 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 26 08:50:15 crc kubenswrapper[4925]: I0226 08:50:15.773306 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 26 08:50:15 crc kubenswrapper[4925]: I0226 08:50:15.804304 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 26 08:50:15 crc kubenswrapper[4925]: I0226 08:50:15.869245 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 26 08:50:15 crc kubenswrapper[4925]: I0226 08:50:15.933573 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 26 08:50:15 crc kubenswrapper[4925]: I0226 08:50:15.969016 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.100841 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.128911 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.174011 4925 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7897f8bf67-s64r4"] Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.176436 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.179101 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.179228 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.179310 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.179465 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.180066 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.180167 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.180678 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.182976 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7897f8bf67-s64r4"] Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.183486 4925 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.183614 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.183668 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.183617 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.188968 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.189758 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.192614 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.203808 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.225753 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-user-template-login\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.225798 4925 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ccd33b03-20a4-478e-8f60-c2fddf37f16f-audit-dir\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.225834 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-user-template-error\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.225850 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ccd33b03-20a4-478e-8f60-c2fddf37f16f-audit-policies\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.225874 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.225895 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.225914 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.225933 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-session\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.225955 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.225969 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.225989 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-service-ca\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.226006 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-router-certs\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.226160 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5sgt\" (UniqueName: \"kubernetes.io/projected/ccd33b03-20a4-478e-8f60-c2fddf37f16f-kube-api-access-b5sgt\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.226230 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 
crc kubenswrapper[4925]: I0226 08:50:16.266207 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.327839 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328171 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328207 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-service-ca\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328229 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-router-certs\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc 
kubenswrapper[4925]: I0226 08:50:16.328248 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5sgt\" (UniqueName: \"kubernetes.io/projected/ccd33b03-20a4-478e-8f60-c2fddf37f16f-kube-api-access-b5sgt\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328301 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328331 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-user-template-login\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328354 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ccd33b03-20a4-478e-8f60-c2fddf37f16f-audit-dir\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328393 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ccd33b03-20a4-478e-8f60-c2fddf37f16f-audit-policies\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: 
\"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328412 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-user-template-error\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328440 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328464 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328490 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328518 4925 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-session\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328773 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328843 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ccd33b03-20a4-478e-8f60-c2fddf37f16f-audit-dir\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.328950 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-service-ca\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.329625 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ccd33b03-20a4-478e-8f60-c2fddf37f16f-audit-policies\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " 
pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.330312 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.334107 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-session\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.334161 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.334178 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.334551 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-system-router-certs\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.334595 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-user-template-login\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.335461 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-user-template-error\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.335670 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.336791 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ccd33b03-20a4-478e-8f60-c2fddf37f16f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " 
pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.347244 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5sgt\" (UniqueName: \"kubernetes.io/projected/ccd33b03-20a4-478e-8f60-c2fddf37f16f-kube-api-access-b5sgt\") pod \"oauth-openshift-7897f8bf67-s64r4\" (UID: \"ccd33b03-20a4-478e-8f60-c2fddf37f16f\") " pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.500051 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.659889 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.781604 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.840888 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 26 08:50:16 crc kubenswrapper[4925]: I0226 08:50:16.882792 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7897f8bf67-s64r4"] Feb 26 08:50:16 crc kubenswrapper[4925]: W0226 08:50:16.889266 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccd33b03_20a4_478e_8f60_c2fddf37f16f.slice/crio-a92e58c3beafaadaba2a7b432dc4167a7cdc8b7ed40b6864e81e1284844ce25c WatchSource:0}: Error finding container a92e58c3beafaadaba2a7b432dc4167a7cdc8b7ed40b6864e81e1284844ce25c: Status 404 returned error can't find the container with id a92e58c3beafaadaba2a7b432dc4167a7cdc8b7ed40b6864e81e1284844ce25c Feb 26 08:50:17 crc 
kubenswrapper[4925]: I0226 08:50:17.069692 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" event={"ID":"ccd33b03-20a4-478e-8f60-c2fddf37f16f","Type":"ContainerStarted","Data":"a92e58c3beafaadaba2a7b432dc4167a7cdc8b7ed40b6864e81e1284844ce25c"} Feb 26 08:50:17 crc kubenswrapper[4925]: I0226 08:50:17.072279 4925 generic.go:334] "Generic (PLEG): container finished" podID="2ed05424-f6d1-49dc-a527-8a32a355c6e1" containerID="5edaf7089152f329b3773f419a7f9d8cd6e66429bd45532fa5478b27829ec4d0" exitCode=0 Feb 26 08:50:17 crc kubenswrapper[4925]: I0226 08:50:17.072344 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534930-h9hwr" event={"ID":"2ed05424-f6d1-49dc-a527-8a32a355c6e1","Type":"ContainerDied","Data":"5edaf7089152f329b3773f419a7f9d8cd6e66429bd45532fa5478b27829ec4d0"} Feb 26 08:50:17 crc kubenswrapper[4925]: I0226 08:50:17.876898 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 26 08:50:17 crc kubenswrapper[4925]: I0226 08:50:17.987681 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 26 08:50:18 crc kubenswrapper[4925]: I0226 08:50:18.080502 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" event={"ID":"ccd33b03-20a4-478e-8f60-c2fddf37f16f","Type":"ContainerStarted","Data":"98e7a17c9086ad534d35b1a7fb1970fc3ed089d1c38a85da3c55bdcac220f8fd"} Feb 26 08:50:18 crc kubenswrapper[4925]: I0226 08:50:18.106656 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" podStartSLOduration=75.106634511 podStartE2EDuration="1m15.106634511s" podCreationTimestamp="2026-02-26 08:49:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:50:18.102834116 +0000 UTC m=+350.242408429" watchObservedRunningTime="2026-02-26 08:50:18.106634511 +0000 UTC m=+350.246208854" Feb 26 08:50:18 crc kubenswrapper[4925]: I0226 08:50:18.356236 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534930-h9hwr" Feb 26 08:50:18 crc kubenswrapper[4925]: I0226 08:50:18.452781 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrhwm\" (UniqueName: \"kubernetes.io/projected/2ed05424-f6d1-49dc-a527-8a32a355c6e1-kube-api-access-lrhwm\") pod \"2ed05424-f6d1-49dc-a527-8a32a355c6e1\" (UID: \"2ed05424-f6d1-49dc-a527-8a32a355c6e1\") " Feb 26 08:50:18 crc kubenswrapper[4925]: I0226 08:50:18.458074 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed05424-f6d1-49dc-a527-8a32a355c6e1-kube-api-access-lrhwm" (OuterVolumeSpecName: "kube-api-access-lrhwm") pod "2ed05424-f6d1-49dc-a527-8a32a355c6e1" (UID: "2ed05424-f6d1-49dc-a527-8a32a355c6e1"). InnerVolumeSpecName "kube-api-access-lrhwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:50:18 crc kubenswrapper[4925]: I0226 08:50:18.554508 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrhwm\" (UniqueName: \"kubernetes.io/projected/2ed05424-f6d1-49dc-a527-8a32a355c6e1-kube-api-access-lrhwm\") on node \"crc\" DevicePath \"\"" Feb 26 08:50:19 crc kubenswrapper[4925]: I0226 08:50:19.090089 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534930-h9hwr" event={"ID":"2ed05424-f6d1-49dc-a527-8a32a355c6e1","Type":"ContainerDied","Data":"7fe4bd0b489fcc3f0b057b07e8abcf7909bb2e436eee680d5d9511ddebd941fe"} Feb 26 08:50:19 crc kubenswrapper[4925]: I0226 08:50:19.090179 4925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fe4bd0b489fcc3f0b057b07e8abcf7909bb2e436eee680d5d9511ddebd941fe" Feb 26 08:50:19 crc kubenswrapper[4925]: I0226 08:50:19.090111 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534930-h9hwr" Feb 26 08:50:19 crc kubenswrapper[4925]: I0226 08:50:19.090741 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:19 crc kubenswrapper[4925]: I0226 08:50:19.099373 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7897f8bf67-s64r4" Feb 26 08:50:24 crc kubenswrapper[4925]: I0226 08:50:24.613276 4925 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 26 08:50:24 crc kubenswrapper[4925]: I0226 08:50:24.613787 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" 
containerID="cri-o://c3e875887d3713fbbe41e0d57043fbc94df80d6590717f0b11fd3477674162c8" gracePeriod=5 Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.158339 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.159033 4925 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="c3e875887d3713fbbe41e0d57043fbc94df80d6590717f0b11fd3477674162c8" exitCode=137 Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.159085 4925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="129eb4a6299fa3f1929e749f669a36a26eecacd2de4b8e89e4c19865db1268ef" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.179937 4925 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.180246 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.301572 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.301664 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.301700 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.301722 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.301764 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.301791 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.301811 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.301801 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.301881 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.302184 4925 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.302197 4925 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.302205 4925 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.302216 4925 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.309619 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.403609 4925 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 26 08:50:30 crc kubenswrapper[4925]: I0226 08:50:30.515022 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 26 08:50:31 crc kubenswrapper[4925]: I0226 08:50:31.164613 4925 generic.go:334] "Generic (PLEG): container finished" podID="dabc4264-eb11-4a46-ad16-75e286b8f776" containerID="70fc8563458bbfe8669bd85ef9d0514f77ce897a3df52a72c89c814dcebc2c0b" exitCode=0 Feb 26 08:50:31 crc kubenswrapper[4925]: I0226 08:50:31.164689 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 08:50:31 crc kubenswrapper[4925]: I0226 08:50:31.164710 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" event={"ID":"dabc4264-eb11-4a46-ad16-75e286b8f776","Type":"ContainerDied","Data":"70fc8563458bbfe8669bd85ef9d0514f77ce897a3df52a72c89c814dcebc2c0b"} Feb 26 08:50:31 crc kubenswrapper[4925]: I0226 08:50:31.165156 4925 scope.go:117] "RemoveContainer" containerID="70fc8563458bbfe8669bd85ef9d0514f77ce897a3df52a72c89c814dcebc2c0b" Feb 26 08:50:31 crc kubenswrapper[4925]: I0226 08:50:31.920890 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 26 08:50:32 crc kubenswrapper[4925]: I0226 08:50:32.171291 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" 
event={"ID":"dabc4264-eb11-4a46-ad16-75e286b8f776","Type":"ContainerStarted","Data":"8e32cb5a035e48a28016c780c0400dde9596f63993f95a11fe629e77c7307159"} Feb 26 08:50:32 crc kubenswrapper[4925]: I0226 08:50:32.171640 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" Feb 26 08:50:32 crc kubenswrapper[4925]: I0226 08:50:32.176467 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.228181 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f5wwj"] Feb 26 08:50:46 crc kubenswrapper[4925]: E0226 08:50:46.229851 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed05424-f6d1-49dc-a527-8a32a355c6e1" containerName="oc" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.229933 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed05424-f6d1-49dc-a527-8a32a355c6e1" containerName="oc" Feb 26 08:50:46 crc kubenswrapper[4925]: E0226 08:50:46.230003 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.230067 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.230262 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.230360 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed05424-f6d1-49dc-a527-8a32a355c6e1" containerName="oc" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.230833 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.244758 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f5wwj"] Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.372812 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kclsj\" (UniqueName: \"kubernetes.io/projected/dd2546ed-820a-4013-b35a-009df2c0cacc-kube-api-access-kclsj\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.372898 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.372935 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd2546ed-820a-4013-b35a-009df2c0cacc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.372972 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd2546ed-820a-4013-b35a-009df2c0cacc-registry-certificates\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.373021 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd2546ed-820a-4013-b35a-009df2c0cacc-bound-sa-token\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.373046 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd2546ed-820a-4013-b35a-009df2c0cacc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.373111 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd2546ed-820a-4013-b35a-009df2c0cacc-registry-tls\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.373144 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd2546ed-820a-4013-b35a-009df2c0cacc-trusted-ca\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.393834 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.474695 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd2546ed-820a-4013-b35a-009df2c0cacc-trusted-ca\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.474833 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kclsj\" (UniqueName: \"kubernetes.io/projected/dd2546ed-820a-4013-b35a-009df2c0cacc-kube-api-access-kclsj\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.474895 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd2546ed-820a-4013-b35a-009df2c0cacc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.474945 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd2546ed-820a-4013-b35a-009df2c0cacc-registry-certificates\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.474990 4925 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd2546ed-820a-4013-b35a-009df2c0cacc-bound-sa-token\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.475025 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd2546ed-820a-4013-b35a-009df2c0cacc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.475061 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd2546ed-820a-4013-b35a-009df2c0cacc-registry-tls\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.475659 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd2546ed-820a-4013-b35a-009df2c0cacc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.476341 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd2546ed-820a-4013-b35a-009df2c0cacc-registry-certificates\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.477556 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd2546ed-820a-4013-b35a-009df2c0cacc-trusted-ca\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.487848 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd2546ed-820a-4013-b35a-009df2c0cacc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.487924 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd2546ed-820a-4013-b35a-009df2c0cacc-registry-tls\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.491065 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kclsj\" (UniqueName: \"kubernetes.io/projected/dd2546ed-820a-4013-b35a-009df2c0cacc-kube-api-access-kclsj\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: \"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.499400 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd2546ed-820a-4013-b35a-009df2c0cacc-bound-sa-token\") pod \"image-registry-66df7c8f76-f5wwj\" (UID: 
\"dd2546ed-820a-4013-b35a-009df2c0cacc\") " pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:46 crc kubenswrapper[4925]: I0226 08:50:46.557342 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:47 crc kubenswrapper[4925]: I0226 08:50:47.001389 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f5wwj"] Feb 26 08:50:47 crc kubenswrapper[4925]: W0226 08:50:47.008202 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd2546ed_820a_4013_b35a_009df2c0cacc.slice/crio-4893ab1e5b3777fcd4d11af447291ce1fff6661866f6352a76869f3f3f8c9fc9 WatchSource:0}: Error finding container 4893ab1e5b3777fcd4d11af447291ce1fff6661866f6352a76869f3f3f8c9fc9: Status 404 returned error can't find the container with id 4893ab1e5b3777fcd4d11af447291ce1fff6661866f6352a76869f3f3f8c9fc9 Feb 26 08:50:47 crc kubenswrapper[4925]: I0226 08:50:47.536165 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" event={"ID":"dd2546ed-820a-4013-b35a-009df2c0cacc","Type":"ContainerStarted","Data":"8fbead80ebb473893b63a3c973396d4c2a4984f0a48fcd0be488aff5a9adcbfc"} Feb 26 08:50:47 crc kubenswrapper[4925]: I0226 08:50:47.536503 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:50:47 crc kubenswrapper[4925]: I0226 08:50:47.536516 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" event={"ID":"dd2546ed-820a-4013-b35a-009df2c0cacc","Type":"ContainerStarted","Data":"4893ab1e5b3777fcd4d11af447291ce1fff6661866f6352a76869f3f3f8c9fc9"} Feb 26 08:50:47 crc kubenswrapper[4925]: I0226 08:50:47.573695 4925 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" podStartSLOduration=1.573659715 podStartE2EDuration="1.573659715s" podCreationTimestamp="2026-02-26 08:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:50:47.563290984 +0000 UTC m=+379.702865287" watchObservedRunningTime="2026-02-26 08:50:47.573659715 +0000 UTC m=+379.713234038" Feb 26 08:50:50 crc kubenswrapper[4925]: I0226 08:50:50.582584 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 26 08:51:06 crc kubenswrapper[4925]: I0226 08:51:06.561846 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-f5wwj" Feb 26 08:51:06 crc kubenswrapper[4925]: I0226 08:51:06.637229 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rrzqs"] Feb 26 08:51:31 crc kubenswrapper[4925]: I0226 08:51:31.677885 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" podUID="dd43ed89-2634-4c7a-a030-a468d0b1c481" containerName="registry" containerID="cri-o://7234b8d5dc8243821856cce1a3a904e872e85fa81f26e74d8777756ec254db93" gracePeriod=30 Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.065140 4925 generic.go:334] "Generic (PLEG): container finished" podID="dd43ed89-2634-4c7a-a030-a468d0b1c481" containerID="7234b8d5dc8243821856cce1a3a904e872e85fa81f26e74d8777756ec254db93" exitCode=0 Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.065206 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" 
event={"ID":"dd43ed89-2634-4c7a-a030-a468d0b1c481","Type":"ContainerDied","Data":"7234b8d5dc8243821856cce1a3a904e872e85fa81f26e74d8777756ec254db93"} Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.096000 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.122899 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd43ed89-2634-4c7a-a030-a468d0b1c481-registry-certificates\") pod \"dd43ed89-2634-4c7a-a030-a468d0b1c481\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.123853 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"dd43ed89-2634-4c7a-a030-a468d0b1c481\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.123925 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-registry-tls\") pod \"dd43ed89-2634-4c7a-a030-a468d0b1c481\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.126948 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd43ed89-2634-4c7a-a030-a468d0b1c481-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "dd43ed89-2634-4c7a-a030-a468d0b1c481" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.133915 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "dd43ed89-2634-4c7a-a030-a468d0b1c481" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.148547 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "dd43ed89-2634-4c7a-a030-a468d0b1c481" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.224956 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd43ed89-2634-4c7a-a030-a468d0b1c481-ca-trust-extracted\") pod \"dd43ed89-2634-4c7a-a030-a468d0b1c481\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.225070 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zhr\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-kube-api-access-x4zhr\") pod \"dd43ed89-2634-4c7a-a030-a468d0b1c481\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") " Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.225157 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-bound-sa-token\") pod 
\"dd43ed89-2634-4c7a-a030-a468d0b1c481\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") "
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.225187 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd43ed89-2634-4c7a-a030-a468d0b1c481-installation-pull-secrets\") pod \"dd43ed89-2634-4c7a-a030-a468d0b1c481\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") "
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.225214 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd43ed89-2634-4c7a-a030-a468d0b1c481-trusted-ca\") pod \"dd43ed89-2634-4c7a-a030-a468d0b1c481\" (UID: \"dd43ed89-2634-4c7a-a030-a468d0b1c481\") "
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.225464 4925 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd43ed89-2634-4c7a-a030-a468d0b1c481-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.225479 4925 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.226063 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd43ed89-2634-4c7a-a030-a468d0b1c481-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "dd43ed89-2634-4c7a-a030-a468d0b1c481" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.229022 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd43ed89-2634-4c7a-a030-a468d0b1c481-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "dd43ed89-2634-4c7a-a030-a468d0b1c481" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.229476 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "dd43ed89-2634-4c7a-a030-a468d0b1c481" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.229851 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-kube-api-access-x4zhr" (OuterVolumeSpecName: "kube-api-access-x4zhr") pod "dd43ed89-2634-4c7a-a030-a468d0b1c481" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481"). InnerVolumeSpecName "kube-api-access-x4zhr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.241333 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd43ed89-2634-4c7a-a030-a468d0b1c481-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "dd43ed89-2634-4c7a-a030-a468d0b1c481" (UID: "dd43ed89-2634-4c7a-a030-a468d0b1c481"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.326682 4925 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.326730 4925 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd43ed89-2634-4c7a-a030-a468d0b1c481-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.326746 4925 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd43ed89-2634-4c7a-a030-a468d0b1c481-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.326759 4925 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd43ed89-2634-4c7a-a030-a468d0b1c481-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:32 crc kubenswrapper[4925]: I0226 08:51:32.326771 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zhr\" (UniqueName: \"kubernetes.io/projected/dd43ed89-2634-4c7a-a030-a468d0b1c481-kube-api-access-x4zhr\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:33 crc kubenswrapper[4925]: I0226 08:51:33.072845 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs" event={"ID":"dd43ed89-2634-4c7a-a030-a468d0b1c481","Type":"ContainerDied","Data":"f6c84832e5cc37a02c8bd503ac7c59606523f51f8dee9a8a9069b52698edcb21"}
Feb 26 08:51:33 crc kubenswrapper[4925]: I0226 08:51:33.072909 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-rrzqs"
Feb 26 08:51:33 crc kubenswrapper[4925]: I0226 08:51:33.073200 4925 scope.go:117] "RemoveContainer" containerID="7234b8d5dc8243821856cce1a3a904e872e85fa81f26e74d8777756ec254db93"
Feb 26 08:51:33 crc kubenswrapper[4925]: I0226 08:51:33.092573 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rrzqs"]
Feb 26 08:51:33 crc kubenswrapper[4925]: I0226 08:51:33.097986 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-rrzqs"]
Feb 26 08:51:34 crc kubenswrapper[4925]: I0226 08:51:34.518411 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd43ed89-2634-4c7a-a030-a468d0b1c481" path="/var/lib/kubelet/pods/dd43ed89-2634-4c7a-a030-a468d0b1c481/volumes"
Feb 26 08:51:45 crc kubenswrapper[4925]: I0226 08:51:45.872031 4925 patch_prober.go:28] interesting pod/machine-config-daemon-s6pcl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 08:51:45 crc kubenswrapper[4925]: I0226 08:51:45.872672 4925 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" podUID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 08:51:50 crc kubenswrapper[4925]: I0226 08:51:50.937283 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x9xkw"]
Feb 26 08:51:50 crc kubenswrapper[4925]: I0226 08:51:50.937997 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x9xkw" podUID="5a8eea64-29d5-46bc-89db-23ad451eba4f" containerName="registry-server" containerID="cri-o://5968d9351a8ad434084b5595573ee607bfaa6c8e0927f84642ce7833416d0062" gracePeriod=30
Feb 26 08:51:50 crc kubenswrapper[4925]: I0226 08:51:50.961516 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgq9g"]
Feb 26 08:51:50 crc kubenswrapper[4925]: I0226 08:51:50.961800 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xgq9g" podUID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" containerName="registry-server" containerID="cri-o://fb7ce64727a73d224841a28a1448600af4563f8cc3161964be5b24b1f67f6e50" gracePeriod=30
Feb 26 08:51:50 crc kubenswrapper[4925]: I0226 08:51:50.994814 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6nml7"]
Feb 26 08:51:50 crc kubenswrapper[4925]: I0226 08:51:50.995084 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" podUID="dabc4264-eb11-4a46-ad16-75e286b8f776" containerName="marketplace-operator" containerID="cri-o://8e32cb5a035e48a28016c780c0400dde9596f63993f95a11fe629e77c7307159" gracePeriod=30
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.003723 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q2rd7"]
Feb 26 08:51:51 crc kubenswrapper[4925]: E0226 08:51:51.004006 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd43ed89-2634-4c7a-a030-a468d0b1c481" containerName="registry"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.004030 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd43ed89-2634-4c7a-a030-a468d0b1c481" containerName="registry"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.004170 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd43ed89-2634-4c7a-a030-a468d0b1c481" containerName="registry"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.006325 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.007597 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea55f9b9-fff3-40ec-b50c-885ada424722-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q2rd7\" (UID: \"ea55f9b9-fff3-40ec-b50c-885ada424722\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.007645 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea55f9b9-fff3-40ec-b50c-885ada424722-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q2rd7\" (UID: \"ea55f9b9-fff3-40ec-b50c-885ada424722\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.007706 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brmww\" (UniqueName: \"kubernetes.io/projected/ea55f9b9-fff3-40ec-b50c-885ada424722-kube-api-access-brmww\") pod \"marketplace-operator-79b997595-q2rd7\" (UID: \"ea55f9b9-fff3-40ec-b50c-885ada424722\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.010897 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrq8g"]
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.011297 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mrq8g" podUID="ce17c01b-f0a9-4f82-9512-028c8f673f85" containerName="registry-server" containerID="cri-o://ab4c89104c314fc942ee59ae17bf5cecafa7ef15189c401d3e135a7b520c22b8" gracePeriod=30
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.014506 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q2rd7"]
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.017647 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zs499"]
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.017920 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zs499" podUID="75ccca07-d75b-4817-b203-931dc51638f4" containerName="registry-server" containerID="cri-o://86900f507ee604438c059aa08e874c57a98d6aef7d5045e683084aac6f23b123" gracePeriod=30
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.109303 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea55f9b9-fff3-40ec-b50c-885ada424722-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q2rd7\" (UID: \"ea55f9b9-fff3-40ec-b50c-885ada424722\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.109385 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea55f9b9-fff3-40ec-b50c-885ada424722-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q2rd7\" (UID: \"ea55f9b9-fff3-40ec-b50c-885ada424722\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.109462 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brmww\" (UniqueName: \"kubernetes.io/projected/ea55f9b9-fff3-40ec-b50c-885ada424722-kube-api-access-brmww\") pod \"marketplace-operator-79b997595-q2rd7\" (UID: \"ea55f9b9-fff3-40ec-b50c-885ada424722\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.112384 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea55f9b9-fff3-40ec-b50c-885ada424722-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q2rd7\" (UID: \"ea55f9b9-fff3-40ec-b50c-885ada424722\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.117485 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea55f9b9-fff3-40ec-b50c-885ada424722-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q2rd7\" (UID: \"ea55f9b9-fff3-40ec-b50c-885ada424722\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.128735 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brmww\" (UniqueName: \"kubernetes.io/projected/ea55f9b9-fff3-40ec-b50c-885ada424722-kube-api-access-brmww\") pod \"marketplace-operator-79b997595-q2rd7\" (UID: \"ea55f9b9-fff3-40ec-b50c-885ada424722\") " pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.199899 4925 generic.go:334] "Generic (PLEG): container finished" podID="75ccca07-d75b-4817-b203-931dc51638f4" containerID="86900f507ee604438c059aa08e874c57a98d6aef7d5045e683084aac6f23b123" exitCode=0
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.199954 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs499" event={"ID":"75ccca07-d75b-4817-b203-931dc51638f4","Type":"ContainerDied","Data":"86900f507ee604438c059aa08e874c57a98d6aef7d5045e683084aac6f23b123"}
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.208901 4925 generic.go:334] "Generic (PLEG): container finished" podID="dabc4264-eb11-4a46-ad16-75e286b8f776" containerID="8e32cb5a035e48a28016c780c0400dde9596f63993f95a11fe629e77c7307159" exitCode=0
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.208972 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" event={"ID":"dabc4264-eb11-4a46-ad16-75e286b8f776","Type":"ContainerDied","Data":"8e32cb5a035e48a28016c780c0400dde9596f63993f95a11fe629e77c7307159"}
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.209010 4925 scope.go:117] "RemoveContainer" containerID="70fc8563458bbfe8669bd85ef9d0514f77ce897a3df52a72c89c814dcebc2c0b"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.224268 4925 generic.go:334] "Generic (PLEG): container finished" podID="ce17c01b-f0a9-4f82-9512-028c8f673f85" containerID="ab4c89104c314fc942ee59ae17bf5cecafa7ef15189c401d3e135a7b520c22b8" exitCode=0
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.224344 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrq8g" event={"ID":"ce17c01b-f0a9-4f82-9512-028c8f673f85","Type":"ContainerDied","Data":"ab4c89104c314fc942ee59ae17bf5cecafa7ef15189c401d3e135a7b520c22b8"}
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.227573 4925 generic.go:334] "Generic (PLEG): container finished" podID="5a8eea64-29d5-46bc-89db-23ad451eba4f" containerID="5968d9351a8ad434084b5595573ee607bfaa6c8e0927f84642ce7833416d0062" exitCode=0
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.227643 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9xkw" event={"ID":"5a8eea64-29d5-46bc-89db-23ad451eba4f","Type":"ContainerDied","Data":"5968d9351a8ad434084b5595573ee607bfaa6c8e0927f84642ce7833416d0062"}
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.231692 4925 generic.go:334] "Generic (PLEG): container finished" podID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" containerID="fb7ce64727a73d224841a28a1448600af4563f8cc3161964be5b24b1f67f6e50" exitCode=0
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.231749 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgq9g" event={"ID":"b8aa32c8-5b82-46ac-a7e2-7d6399474aff","Type":"ContainerDied","Data":"fb7ce64727a73d224841a28a1448600af4563f8cc3161964be5b24b1f67f6e50"}
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.414190 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.417454 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.419483 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x9xkw"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.427388 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mrq8g"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.430382 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zs499"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.456400 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7"
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.615946 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-catalog-content\") pod \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\" (UID: \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.615988 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvprv\" (UniqueName: \"kubernetes.io/projected/75ccca07-d75b-4817-b203-931dc51638f4-kube-api-access-tvprv\") pod \"75ccca07-d75b-4817-b203-931dc51638f4\" (UID: \"75ccca07-d75b-4817-b203-931dc51638f4\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.616009 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dabc4264-eb11-4a46-ad16-75e286b8f776-marketplace-operator-metrics\") pod \"dabc4264-eb11-4a46-ad16-75e286b8f776\" (UID: \"dabc4264-eb11-4a46-ad16-75e286b8f776\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.616029 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce17c01b-f0a9-4f82-9512-028c8f673f85-utilities\") pod \"ce17c01b-f0a9-4f82-9512-028c8f673f85\" (UID: \"ce17c01b-f0a9-4f82-9512-028c8f673f85\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.616056 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a8eea64-29d5-46bc-89db-23ad451eba4f-catalog-content\") pod \"5a8eea64-29d5-46bc-89db-23ad451eba4f\" (UID: \"5a8eea64-29d5-46bc-89db-23ad451eba4f\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.616076 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75ccca07-d75b-4817-b203-931dc51638f4-catalog-content\") pod \"75ccca07-d75b-4817-b203-931dc51638f4\" (UID: \"75ccca07-d75b-4817-b203-931dc51638f4\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.616096 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czrnm\" (UniqueName: \"kubernetes.io/projected/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-kube-api-access-czrnm\") pod \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\" (UID: \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.616120 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75ccca07-d75b-4817-b203-931dc51638f4-utilities\") pod \"75ccca07-d75b-4817-b203-931dc51638f4\" (UID: \"75ccca07-d75b-4817-b203-931dc51638f4\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.616160 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6tb5\" (UniqueName: \"kubernetes.io/projected/5a8eea64-29d5-46bc-89db-23ad451eba4f-kube-api-access-j6tb5\") pod \"5a8eea64-29d5-46bc-89db-23ad451eba4f\" (UID: \"5a8eea64-29d5-46bc-89db-23ad451eba4f\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.616191 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dabc4264-eb11-4a46-ad16-75e286b8f776-marketplace-trusted-ca\") pod \"dabc4264-eb11-4a46-ad16-75e286b8f776\" (UID: \"dabc4264-eb11-4a46-ad16-75e286b8f776\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.616211 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce17c01b-f0a9-4f82-9512-028c8f673f85-catalog-content\") pod \"ce17c01b-f0a9-4f82-9512-028c8f673f85\" (UID: \"ce17c01b-f0a9-4f82-9512-028c8f673f85\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.616229 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nff7k\" (UniqueName: \"kubernetes.io/projected/ce17c01b-f0a9-4f82-9512-028c8f673f85-kube-api-access-nff7k\") pod \"ce17c01b-f0a9-4f82-9512-028c8f673f85\" (UID: \"ce17c01b-f0a9-4f82-9512-028c8f673f85\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.616242 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-utilities\") pod \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\" (UID: \"b8aa32c8-5b82-46ac-a7e2-7d6399474aff\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.616287 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a8eea64-29d5-46bc-89db-23ad451eba4f-utilities\") pod \"5a8eea64-29d5-46bc-89db-23ad451eba4f\" (UID: \"5a8eea64-29d5-46bc-89db-23ad451eba4f\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.616320 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67pdr\" (UniqueName: \"kubernetes.io/projected/dabc4264-eb11-4a46-ad16-75e286b8f776-kube-api-access-67pdr\") pod \"dabc4264-eb11-4a46-ad16-75e286b8f776\" (UID: \"dabc4264-eb11-4a46-ad16-75e286b8f776\") "
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.619006 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce17c01b-f0a9-4f82-9512-028c8f673f85-utilities" (OuterVolumeSpecName: "utilities") pod "ce17c01b-f0a9-4f82-9512-028c8f673f85" (UID: "ce17c01b-f0a9-4f82-9512-028c8f673f85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.621000 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75ccca07-d75b-4817-b203-931dc51638f4-utilities" (OuterVolumeSpecName: "utilities") pod "75ccca07-d75b-4817-b203-931dc51638f4" (UID: "75ccca07-d75b-4817-b203-931dc51638f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.621581 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-utilities" (OuterVolumeSpecName: "utilities") pod "b8aa32c8-5b82-46ac-a7e2-7d6399474aff" (UID: "b8aa32c8-5b82-46ac-a7e2-7d6399474aff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.622442 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a8eea64-29d5-46bc-89db-23ad451eba4f-utilities" (OuterVolumeSpecName: "utilities") pod "5a8eea64-29d5-46bc-89db-23ad451eba4f" (UID: "5a8eea64-29d5-46bc-89db-23ad451eba4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.623703 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce17c01b-f0a9-4f82-9512-028c8f673f85-kube-api-access-nff7k" (OuterVolumeSpecName: "kube-api-access-nff7k") pod "ce17c01b-f0a9-4f82-9512-028c8f673f85" (UID: "ce17c01b-f0a9-4f82-9512-028c8f673f85"). InnerVolumeSpecName "kube-api-access-nff7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.622895 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dabc4264-eb11-4a46-ad16-75e286b8f776-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "dabc4264-eb11-4a46-ad16-75e286b8f776" (UID: "dabc4264-eb11-4a46-ad16-75e286b8f776"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.624888 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a8eea64-29d5-46bc-89db-23ad451eba4f-kube-api-access-j6tb5" (OuterVolumeSpecName: "kube-api-access-j6tb5") pod "5a8eea64-29d5-46bc-89db-23ad451eba4f" (UID: "5a8eea64-29d5-46bc-89db-23ad451eba4f"). InnerVolumeSpecName "kube-api-access-j6tb5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.626462 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabc4264-eb11-4a46-ad16-75e286b8f776-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "dabc4264-eb11-4a46-ad16-75e286b8f776" (UID: "dabc4264-eb11-4a46-ad16-75e286b8f776"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.630728 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75ccca07-d75b-4817-b203-931dc51638f4-kube-api-access-tvprv" (OuterVolumeSpecName: "kube-api-access-tvprv") pod "75ccca07-d75b-4817-b203-931dc51638f4" (UID: "75ccca07-d75b-4817-b203-931dc51638f4"). InnerVolumeSpecName "kube-api-access-tvprv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.632236 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q2rd7"]
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.641418 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabc4264-eb11-4a46-ad16-75e286b8f776-kube-api-access-67pdr" (OuterVolumeSpecName: "kube-api-access-67pdr") pod "dabc4264-eb11-4a46-ad16-75e286b8f776" (UID: "dabc4264-eb11-4a46-ad16-75e286b8f776"). InnerVolumeSpecName "kube-api-access-67pdr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.647351 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-kube-api-access-czrnm" (OuterVolumeSpecName: "kube-api-access-czrnm") pod "b8aa32c8-5b82-46ac-a7e2-7d6399474aff" (UID: "b8aa32c8-5b82-46ac-a7e2-7d6399474aff"). InnerVolumeSpecName "kube-api-access-czrnm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.667191 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce17c01b-f0a9-4f82-9512-028c8f673f85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce17c01b-f0a9-4f82-9512-028c8f673f85" (UID: "ce17c01b-f0a9-4f82-9512-028c8f673f85"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.701070 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a8eea64-29d5-46bc-89db-23ad451eba4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a8eea64-29d5-46bc-89db-23ad451eba4f" (UID: "5a8eea64-29d5-46bc-89db-23ad451eba4f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.711155 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8aa32c8-5b82-46ac-a7e2-7d6399474aff" (UID: "b8aa32c8-5b82-46ac-a7e2-7d6399474aff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.718867 4925 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a8eea64-29d5-46bc-89db-23ad451eba4f-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.718920 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67pdr\" (UniqueName: \"kubernetes.io/projected/dabc4264-eb11-4a46-ad16-75e286b8f776-kube-api-access-67pdr\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.718970 4925 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.718993 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvprv\" (UniqueName: \"kubernetes.io/projected/75ccca07-d75b-4817-b203-931dc51638f4-kube-api-access-tvprv\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.719012 4925 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dabc4264-eb11-4a46-ad16-75e286b8f776-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.719031 4925 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce17c01b-f0a9-4f82-9512-028c8f673f85-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.719047 4925 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a8eea64-29d5-46bc-89db-23ad451eba4f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.719064 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czrnm\" (UniqueName: \"kubernetes.io/projected/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-kube-api-access-czrnm\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.719082 4925 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75ccca07-d75b-4817-b203-931dc51638f4-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.719101 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6tb5\" (UniqueName: \"kubernetes.io/projected/5a8eea64-29d5-46bc-89db-23ad451eba4f-kube-api-access-j6tb5\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.719117 4925 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dabc4264-eb11-4a46-ad16-75e286b8f776-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.719134 4925 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce17c01b-f0a9-4f82-9512-028c8f673f85-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.719151 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nff7k\" (UniqueName: \"kubernetes.io/projected/ce17c01b-f0a9-4f82-9512-028c8f673f85-kube-api-access-nff7k\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.719167 4925 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8aa32c8-5b82-46ac-a7e2-7d6399474aff-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.776653 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75ccca07-d75b-4817-b203-931dc51638f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75ccca07-d75b-4817-b203-931dc51638f4" (UID: "75ccca07-d75b-4817-b203-931dc51638f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 08:51:51 crc kubenswrapper[4925]: I0226 08:51:51.820491 4925 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75ccca07-d75b-4817-b203-931dc51638f4-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.238556 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7" event={"ID":"dabc4264-eb11-4a46-ad16-75e286b8f776","Type":"ContainerDied","Data":"c15307215d40d3c732a4cd731806534419cbcb80148c806244169bf18d5d180b"}
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.238575 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6nml7"
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.238639 4925 scope.go:117] "RemoveContainer" containerID="8e32cb5a035e48a28016c780c0400dde9596f63993f95a11fe629e77c7307159"
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.241359 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mrq8g" event={"ID":"ce17c01b-f0a9-4f82-9512-028c8f673f85","Type":"ContainerDied","Data":"39e049e87669dbaebcdddcf0c2cc23f541fd72c9f7ece7b55b5b84e1355be411"}
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.241460 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mrq8g"
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.246957 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x9xkw" event={"ID":"5a8eea64-29d5-46bc-89db-23ad451eba4f","Type":"ContainerDied","Data":"a67205620aa955172bda0954fb651162873aa1a19d3b7052cbf60e351f1c196c"}
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.247043 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x9xkw"
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.256267 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgq9g" event={"ID":"b8aa32c8-5b82-46ac-a7e2-7d6399474aff","Type":"ContainerDied","Data":"033e63046163c4208d399429317a7538136001b1e5de8831f43bbe95c1c21611"}
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.256333 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgq9g"
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.261245 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zs499" event={"ID":"75ccca07-d75b-4817-b203-931dc51638f4","Type":"ContainerDied","Data":"1e5a2c8008f7435ae1f2deb5b407d50be54c6e6b45d48c61c4dfeac8f972e627"}
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.261356 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zs499"
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.265982 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7" event={"ID":"ea55f9b9-fff3-40ec-b50c-885ada424722","Type":"ContainerStarted","Data":"b8e6d686a0967ea3bcd5a96a79c3c9b60f5c0fbe5e056b924ec28cf8527430d9"}
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.266021 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7" event={"ID":"ea55f9b9-fff3-40ec-b50c-885ada424722","Type":"ContainerStarted","Data":"45a214362f339ea48f915ac446a23635589dde237a729a7c002db0791ca27b01"}
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.266514 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7"
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.270241 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7"
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.275427 4925 scope.go:117] "RemoveContainer" containerID="ab4c89104c314fc942ee59ae17bf5cecafa7ef15189c401d3e135a7b520c22b8"
Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.289016 4925 scope.go:117] "RemoveContainer"
containerID="795f784e1b60ad89295544dc1390e730f3c25ffcb77de7bc20af4e3016f3f5ab" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.419906 4925 scope.go:117] "RemoveContainer" containerID="c8b1e07a0d3639741d9d0d68331d2a35def484baf505cd2b7b64279b21456d5d" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.423717 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-q2rd7" podStartSLOduration=2.423698321 podStartE2EDuration="2.423698321s" podCreationTimestamp="2026-02-26 08:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 08:51:52.419230666 +0000 UTC m=+444.558804979" watchObservedRunningTime="2026-02-26 08:51:52.423698321 +0000 UTC m=+444.563272604" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.466033 4925 scope.go:117] "RemoveContainer" containerID="5968d9351a8ad434084b5595573ee607bfaa6c8e0927f84642ce7833416d0062" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.468345 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x9xkw"] Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.474134 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x9xkw"] Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.479889 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6nml7"] Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.483175 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6nml7"] Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.490774 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgq9g"] Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.491890 4925 scope.go:117] 
"RemoveContainer" containerID="c22845da049700b9ec7327b515fba0bb7b11c146c7803d1b90c759b7b277467d" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.495505 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xgq9g"] Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.505272 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrq8g"] Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.521829 4925 scope.go:117] "RemoveContainer" containerID="58fd9e548586058b4734a234cde7c37a81649b2d5fd3089d74399ca701060647" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.525208 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a8eea64-29d5-46bc-89db-23ad451eba4f" path="/var/lib/kubelet/pods/5a8eea64-29d5-46bc-89db-23ad451eba4f/volumes" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.526011 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" path="/var/lib/kubelet/pods/b8aa32c8-5b82-46ac-a7e2-7d6399474aff/volumes" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.526774 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dabc4264-eb11-4a46-ad16-75e286b8f776" path="/var/lib/kubelet/pods/dabc4264-eb11-4a46-ad16-75e286b8f776/volumes" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.530612 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mrq8g"] Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.530725 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zs499"] Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.534265 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zs499"] Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.536758 4925 scope.go:117] "RemoveContainer" 
containerID="fb7ce64727a73d224841a28a1448600af4563f8cc3161964be5b24b1f67f6e50" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.552221 4925 scope.go:117] "RemoveContainer" containerID="73a6e07adb99d67f8d2c0f1fb2ec016c86a5e24c69a252f8e78d8620ee3b3e34" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.564513 4925 scope.go:117] "RemoveContainer" containerID="9f17818c45f5e5096157d6abdd941bb28b9a700fa55783eb4b58174eea28deb9" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.585132 4925 scope.go:117] "RemoveContainer" containerID="86900f507ee604438c059aa08e874c57a98d6aef7d5045e683084aac6f23b123" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.597742 4925 scope.go:117] "RemoveContainer" containerID="5f1d6cfdde00966b8efd39d6f328135037a27eac453f379e4a94a477343b99cc" Feb 26 08:51:52 crc kubenswrapper[4925]: I0226 08:51:52.613904 4925 scope.go:117] "RemoveContainer" containerID="8d15b13e080012da2c5ea19ae09f5b1e75cf083730a9853a8839af06a8a4cc24" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.958047 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gzzc6"] Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 08:51:53.960493 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" containerName="extract-content" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.960612 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" containerName="extract-content" Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 08:51:53.960703 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ccca07-d75b-4817-b203-931dc51638f4" containerName="extract-content" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.960782 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ccca07-d75b-4817-b203-931dc51638f4" containerName="extract-content" Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 08:51:53.960892 
4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ccca07-d75b-4817-b203-931dc51638f4" containerName="registry-server" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.960957 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ccca07-d75b-4817-b203-931dc51638f4" containerName="registry-server" Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 08:51:53.961043 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a8eea64-29d5-46bc-89db-23ad451eba4f" containerName="extract-content" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.961126 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a8eea64-29d5-46bc-89db-23ad451eba4f" containerName="extract-content" Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 08:51:53.961251 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a8eea64-29d5-46bc-89db-23ad451eba4f" containerName="registry-server" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.961310 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a8eea64-29d5-46bc-89db-23ad451eba4f" containerName="registry-server" Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 08:51:53.961366 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce17c01b-f0a9-4f82-9512-028c8f673f85" containerName="extract-utilities" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.961426 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce17c01b-f0a9-4f82-9512-028c8f673f85" containerName="extract-utilities" Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 08:51:53.961482 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce17c01b-f0a9-4f82-9512-028c8f673f85" containerName="extract-content" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.961559 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce17c01b-f0a9-4f82-9512-028c8f673f85" containerName="extract-content" Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 08:51:53.961638 4925 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ccca07-d75b-4817-b203-931dc51638f4" containerName="extract-utilities" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.961726 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ccca07-d75b-4817-b203-931dc51638f4" containerName="extract-utilities" Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 08:51:53.961808 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a8eea64-29d5-46bc-89db-23ad451eba4f" containerName="extract-utilities" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.961887 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a8eea64-29d5-46bc-89db-23ad451eba4f" containerName="extract-utilities" Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 08:51:53.961968 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" containerName="registry-server" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.962143 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" containerName="registry-server" Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 08:51:53.962235 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" containerName="extract-utilities" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.962319 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" containerName="extract-utilities" Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 08:51:53.962397 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabc4264-eb11-4a46-ad16-75e286b8f776" containerName="marketplace-operator" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.962493 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabc4264-eb11-4a46-ad16-75e286b8f776" containerName="marketplace-operator" Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 
08:51:53.962602 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce17c01b-f0a9-4f82-9512-028c8f673f85" containerName="registry-server" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.962725 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce17c01b-f0a9-4f82-9512-028c8f673f85" containerName="registry-server" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.962924 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabc4264-eb11-4a46-ad16-75e286b8f776" containerName="marketplace-operator" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.963024 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ccca07-d75b-4817-b203-931dc51638f4" containerName="registry-server" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.963159 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a8eea64-29d5-46bc-89db-23ad451eba4f" containerName="registry-server" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.963333 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabc4264-eb11-4a46-ad16-75e286b8f776" containerName="marketplace-operator" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.963425 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8aa32c8-5b82-46ac-a7e2-7d6399474aff" containerName="registry-server" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.963489 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce17c01b-f0a9-4f82-9512-028c8f673f85" containerName="registry-server" Feb 26 08:51:53 crc kubenswrapper[4925]: E0226 08:51:53.963657 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabc4264-eb11-4a46-ad16-75e286b8f776" containerName="marketplace-operator" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.963728 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabc4264-eb11-4a46-ad16-75e286b8f776" containerName="marketplace-operator" Feb 26 
08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.964409 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.966263 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 08:51:53 crc kubenswrapper[4925]: I0226 08:51:53.968836 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzzc6"] Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.147655 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aafadb1-ebf7-4af2-9766-5a93d4bda27a-utilities\") pod \"redhat-marketplace-gzzc6\" (UID: \"8aafadb1-ebf7-4af2-9766-5a93d4bda27a\") " pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.147895 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aafadb1-ebf7-4af2-9766-5a93d4bda27a-catalog-content\") pod \"redhat-marketplace-gzzc6\" (UID: \"8aafadb1-ebf7-4af2-9766-5a93d4bda27a\") " pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.148010 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92lwz\" (UniqueName: \"kubernetes.io/projected/8aafadb1-ebf7-4af2-9766-5a93d4bda27a-kube-api-access-92lwz\") pod \"redhat-marketplace-gzzc6\" (UID: \"8aafadb1-ebf7-4af2-9766-5a93d4bda27a\") " pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.265217 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8aafadb1-ebf7-4af2-9766-5a93d4bda27a-utilities\") pod \"redhat-marketplace-gzzc6\" (UID: \"8aafadb1-ebf7-4af2-9766-5a93d4bda27a\") " pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.265270 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aafadb1-ebf7-4af2-9766-5a93d4bda27a-catalog-content\") pod \"redhat-marketplace-gzzc6\" (UID: \"8aafadb1-ebf7-4af2-9766-5a93d4bda27a\") " pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.265336 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92lwz\" (UniqueName: \"kubernetes.io/projected/8aafadb1-ebf7-4af2-9766-5a93d4bda27a-kube-api-access-92lwz\") pod \"redhat-marketplace-gzzc6\" (UID: \"8aafadb1-ebf7-4af2-9766-5a93d4bda27a\") " pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.265792 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aafadb1-ebf7-4af2-9766-5a93d4bda27a-catalog-content\") pod \"redhat-marketplace-gzzc6\" (UID: \"8aafadb1-ebf7-4af2-9766-5a93d4bda27a\") " pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.265794 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aafadb1-ebf7-4af2-9766-5a93d4bda27a-utilities\") pod \"redhat-marketplace-gzzc6\" (UID: \"8aafadb1-ebf7-4af2-9766-5a93d4bda27a\") " pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.286485 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92lwz\" (UniqueName: 
\"kubernetes.io/projected/8aafadb1-ebf7-4af2-9766-5a93d4bda27a-kube-api-access-92lwz\") pod \"redhat-marketplace-gzzc6\" (UID: \"8aafadb1-ebf7-4af2-9766-5a93d4bda27a\") " pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.293364 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.466850 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzzc6"] Feb 26 08:51:54 crc kubenswrapper[4925]: W0226 08:51:54.470640 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aafadb1_ebf7_4af2_9766_5a93d4bda27a.slice/crio-195c86915c4b3e6df20ae6cf36c584e84d91128ec21fde182207dada917c3b3a WatchSource:0}: Error finding container 195c86915c4b3e6df20ae6cf36c584e84d91128ec21fde182207dada917c3b3a: Status 404 returned error can't find the container with id 195c86915c4b3e6df20ae6cf36c584e84d91128ec21fde182207dada917c3b3a Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.516872 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75ccca07-d75b-4817-b203-931dc51638f4" path="/var/lib/kubelet/pods/75ccca07-d75b-4817-b203-931dc51638f4/volumes" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.517472 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce17c01b-f0a9-4f82-9512-028c8f673f85" path="/var/lib/kubelet/pods/ce17c01b-f0a9-4f82-9512-028c8f673f85/volumes" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.559502 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vvxd6"] Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.562280 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.564080 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.567579 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vvxd6"] Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.669975 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmwkw\" (UniqueName: \"kubernetes.io/projected/1362c0c3-0b1e-46d1-ad16-14a0514b0f6f-kube-api-access-qmwkw\") pod \"redhat-operators-vvxd6\" (UID: \"1362c0c3-0b1e-46d1-ad16-14a0514b0f6f\") " pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.670107 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1362c0c3-0b1e-46d1-ad16-14a0514b0f6f-utilities\") pod \"redhat-operators-vvxd6\" (UID: \"1362c0c3-0b1e-46d1-ad16-14a0514b0f6f\") " pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.670134 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1362c0c3-0b1e-46d1-ad16-14a0514b0f6f-catalog-content\") pod \"redhat-operators-vvxd6\" (UID: \"1362c0c3-0b1e-46d1-ad16-14a0514b0f6f\") " pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.771401 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1362c0c3-0b1e-46d1-ad16-14a0514b0f6f-utilities\") pod \"redhat-operators-vvxd6\" (UID: \"1362c0c3-0b1e-46d1-ad16-14a0514b0f6f\") " 
pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.771444 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1362c0c3-0b1e-46d1-ad16-14a0514b0f6f-catalog-content\") pod \"redhat-operators-vvxd6\" (UID: \"1362c0c3-0b1e-46d1-ad16-14a0514b0f6f\") " pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.771501 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmwkw\" (UniqueName: \"kubernetes.io/projected/1362c0c3-0b1e-46d1-ad16-14a0514b0f6f-kube-api-access-qmwkw\") pod \"redhat-operators-vvxd6\" (UID: \"1362c0c3-0b1e-46d1-ad16-14a0514b0f6f\") " pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.771971 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1362c0c3-0b1e-46d1-ad16-14a0514b0f6f-utilities\") pod \"redhat-operators-vvxd6\" (UID: \"1362c0c3-0b1e-46d1-ad16-14a0514b0f6f\") " pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.772026 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1362c0c3-0b1e-46d1-ad16-14a0514b0f6f-catalog-content\") pod \"redhat-operators-vvxd6\" (UID: \"1362c0c3-0b1e-46d1-ad16-14a0514b0f6f\") " pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.794364 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmwkw\" (UniqueName: \"kubernetes.io/projected/1362c0c3-0b1e-46d1-ad16-14a0514b0f6f-kube-api-access-qmwkw\") pod \"redhat-operators-vvxd6\" (UID: \"1362c0c3-0b1e-46d1-ad16-14a0514b0f6f\") " pod="openshift-marketplace/redhat-operators-vvxd6" Feb 
26 08:51:54 crc kubenswrapper[4925]: I0226 08:51:54.893312 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:51:55 crc kubenswrapper[4925]: I0226 08:51:55.097363 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vvxd6"] Feb 26 08:51:55 crc kubenswrapper[4925]: I0226 08:51:55.289387 4925 generic.go:334] "Generic (PLEG): container finished" podID="1362c0c3-0b1e-46d1-ad16-14a0514b0f6f" containerID="f825194899f8c89cad3677e3441257aa848ff2ebd5ea1b077dbd109257d66024" exitCode=0 Feb 26 08:51:55 crc kubenswrapper[4925]: I0226 08:51:55.289473 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vvxd6" event={"ID":"1362c0c3-0b1e-46d1-ad16-14a0514b0f6f","Type":"ContainerDied","Data":"f825194899f8c89cad3677e3441257aa848ff2ebd5ea1b077dbd109257d66024"} Feb 26 08:51:55 crc kubenswrapper[4925]: I0226 08:51:55.290096 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vvxd6" event={"ID":"1362c0c3-0b1e-46d1-ad16-14a0514b0f6f","Type":"ContainerStarted","Data":"7bc63988072c1f45e49b16eb9f489a6aabf320bb6700cbc9656ba2c91226467c"} Feb 26 08:51:55 crc kubenswrapper[4925]: I0226 08:51:55.294675 4925 generic.go:334] "Generic (PLEG): container finished" podID="8aafadb1-ebf7-4af2-9766-5a93d4bda27a" containerID="191b16e42898d7cc1eb41fa06782dc5cd2cc6a1b528bad47bbdfbb911512a83a" exitCode=0 Feb 26 08:51:55 crc kubenswrapper[4925]: I0226 08:51:55.294711 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzzc6" event={"ID":"8aafadb1-ebf7-4af2-9766-5a93d4bda27a","Type":"ContainerDied","Data":"191b16e42898d7cc1eb41fa06782dc5cd2cc6a1b528bad47bbdfbb911512a83a"} Feb 26 08:51:55 crc kubenswrapper[4925]: I0226 08:51:55.294754 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzzc6" 
event={"ID":"8aafadb1-ebf7-4af2-9766-5a93d4bda27a","Type":"ContainerStarted","Data":"195c86915c4b3e6df20ae6cf36c584e84d91128ec21fde182207dada917c3b3a"} Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.358147 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sb7h5"] Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.359471 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.362133 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.363771 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sb7h5"] Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.496589 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8adfff67-e5bf-4e74-a479-1136b9ff85c1-utilities\") pod \"community-operators-sb7h5\" (UID: \"8adfff67-e5bf-4e74-a479-1136b9ff85c1\") " pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.496912 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8adfff67-e5bf-4e74-a479-1136b9ff85c1-catalog-content\") pod \"community-operators-sb7h5\" (UID: \"8adfff67-e5bf-4e74-a479-1136b9ff85c1\") " pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.496961 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d9kr\" (UniqueName: \"kubernetes.io/projected/8adfff67-e5bf-4e74-a479-1136b9ff85c1-kube-api-access-6d9kr\") pod 
\"community-operators-sb7h5\" (UID: \"8adfff67-e5bf-4e74-a479-1136b9ff85c1\") " pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.598309 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8adfff67-e5bf-4e74-a479-1136b9ff85c1-utilities\") pod \"community-operators-sb7h5\" (UID: \"8adfff67-e5bf-4e74-a479-1136b9ff85c1\") " pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.598369 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8adfff67-e5bf-4e74-a479-1136b9ff85c1-catalog-content\") pod \"community-operators-sb7h5\" (UID: \"8adfff67-e5bf-4e74-a479-1136b9ff85c1\") " pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.598409 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d9kr\" (UniqueName: \"kubernetes.io/projected/8adfff67-e5bf-4e74-a479-1136b9ff85c1-kube-api-access-6d9kr\") pod \"community-operators-sb7h5\" (UID: \"8adfff67-e5bf-4e74-a479-1136b9ff85c1\") " pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.598883 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8adfff67-e5bf-4e74-a479-1136b9ff85c1-catalog-content\") pod \"community-operators-sb7h5\" (UID: \"8adfff67-e5bf-4e74-a479-1136b9ff85c1\") " pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.598899 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8adfff67-e5bf-4e74-a479-1136b9ff85c1-utilities\") pod \"community-operators-sb7h5\" (UID: 
\"8adfff67-e5bf-4e74-a479-1136b9ff85c1\") " pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.617336 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d9kr\" (UniqueName: \"kubernetes.io/projected/8adfff67-e5bf-4e74-a479-1136b9ff85c1-kube-api-access-6d9kr\") pod \"community-operators-sb7h5\" (UID: \"8adfff67-e5bf-4e74-a479-1136b9ff85c1\") " pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.686777 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.884188 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sb7h5"] Feb 26 08:51:56 crc kubenswrapper[4925]: W0226 08:51:56.891863 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8adfff67_e5bf_4e74_a479_1136b9ff85c1.slice/crio-c7db7681884b2692a3784ac3304b42e65264230e715581efa3a2a52b1ea987f7 WatchSource:0}: Error finding container c7db7681884b2692a3784ac3304b42e65264230e715581efa3a2a52b1ea987f7: Status 404 returned error can't find the container with id c7db7681884b2692a3784ac3304b42e65264230e715581efa3a2a52b1ea987f7 Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.956168 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hbkc7"] Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.957256 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.962726 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 26 08:51:56 crc kubenswrapper[4925]: I0226 08:51:56.968974 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hbkc7"] Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.105875 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a918d8d-5591-4766-b734-1f8988eab653-utilities\") pod \"certified-operators-hbkc7\" (UID: \"1a918d8d-5591-4766-b734-1f8988eab653\") " pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.106383 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a918d8d-5591-4766-b734-1f8988eab653-catalog-content\") pod \"certified-operators-hbkc7\" (UID: \"1a918d8d-5591-4766-b734-1f8988eab653\") " pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.106505 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sksh\" (UniqueName: \"kubernetes.io/projected/1a918d8d-5591-4766-b734-1f8988eab653-kube-api-access-8sksh\") pod \"certified-operators-hbkc7\" (UID: \"1a918d8d-5591-4766-b734-1f8988eab653\") " pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.207955 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sksh\" (UniqueName: \"kubernetes.io/projected/1a918d8d-5591-4766-b734-1f8988eab653-kube-api-access-8sksh\") pod \"certified-operators-hbkc7\" 
(UID: \"1a918d8d-5591-4766-b734-1f8988eab653\") " pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.208015 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a918d8d-5591-4766-b734-1f8988eab653-utilities\") pod \"certified-operators-hbkc7\" (UID: \"1a918d8d-5591-4766-b734-1f8988eab653\") " pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.208050 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a918d8d-5591-4766-b734-1f8988eab653-catalog-content\") pod \"certified-operators-hbkc7\" (UID: \"1a918d8d-5591-4766-b734-1f8988eab653\") " pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.208546 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a918d8d-5591-4766-b734-1f8988eab653-catalog-content\") pod \"certified-operators-hbkc7\" (UID: \"1a918d8d-5591-4766-b734-1f8988eab653\") " pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.209025 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a918d8d-5591-4766-b734-1f8988eab653-utilities\") pod \"certified-operators-hbkc7\" (UID: \"1a918d8d-5591-4766-b734-1f8988eab653\") " pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.229329 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sksh\" (UniqueName: \"kubernetes.io/projected/1a918d8d-5591-4766-b734-1f8988eab653-kube-api-access-8sksh\") pod \"certified-operators-hbkc7\" (UID: \"1a918d8d-5591-4766-b734-1f8988eab653\") " 
pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.287142 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.306225 4925 generic.go:334] "Generic (PLEG): container finished" podID="8aafadb1-ebf7-4af2-9766-5a93d4bda27a" containerID="62bf41eead09ebd8bdd018e02510c2d3a3a741e28ee585f2a7c4ec4a31a5792e" exitCode=0 Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.306285 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzzc6" event={"ID":"8aafadb1-ebf7-4af2-9766-5a93d4bda27a","Type":"ContainerDied","Data":"62bf41eead09ebd8bdd018e02510c2d3a3a741e28ee585f2a7c4ec4a31a5792e"} Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.309355 4925 generic.go:334] "Generic (PLEG): container finished" podID="8adfff67-e5bf-4e74-a479-1136b9ff85c1" containerID="cdd5a86868eb9380408a3ebc24c59ec09d52b3fda0ff5778a0978dbaa35285c0" exitCode=0 Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.310040 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7h5" event={"ID":"8adfff67-e5bf-4e74-a479-1136b9ff85c1","Type":"ContainerDied","Data":"cdd5a86868eb9380408a3ebc24c59ec09d52b3fda0ff5778a0978dbaa35285c0"} Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.310086 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7h5" event={"ID":"8adfff67-e5bf-4e74-a479-1136b9ff85c1","Type":"ContainerStarted","Data":"c7db7681884b2692a3784ac3304b42e65264230e715581efa3a2a52b1ea987f7"} Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.316080 4925 generic.go:334] "Generic (PLEG): container finished" podID="1362c0c3-0b1e-46d1-ad16-14a0514b0f6f" containerID="f5cd258736e55b29565813e10dd94fbe0b20997f3d7d3a137b8a5f378ebc52a5" exitCode=0 Feb 26 08:51:57 crc 
kubenswrapper[4925]: I0226 08:51:57.316127 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vvxd6" event={"ID":"1362c0c3-0b1e-46d1-ad16-14a0514b0f6f","Type":"ContainerDied","Data":"f5cd258736e55b29565813e10dd94fbe0b20997f3d7d3a137b8a5f378ebc52a5"} Feb 26 08:51:57 crc kubenswrapper[4925]: I0226 08:51:57.462090 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hbkc7"] Feb 26 08:51:58 crc kubenswrapper[4925]: I0226 08:51:58.322576 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7h5" event={"ID":"8adfff67-e5bf-4e74-a479-1136b9ff85c1","Type":"ContainerStarted","Data":"c9dc53fe28b855ed7af7feb04f17264d3340fcf77793116bb8fa4557251815b3"} Feb 26 08:51:58 crc kubenswrapper[4925]: I0226 08:51:58.325689 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vvxd6" event={"ID":"1362c0c3-0b1e-46d1-ad16-14a0514b0f6f","Type":"ContainerStarted","Data":"b177d2216314155786b99d57af963c0b20e4bcae70b7409c9c6e4d5b84589d76"} Feb 26 08:51:58 crc kubenswrapper[4925]: I0226 08:51:58.327503 4925 generic.go:334] "Generic (PLEG): container finished" podID="1a918d8d-5591-4766-b734-1f8988eab653" containerID="3039b7292a1a354f3e2a34d57b17e1cdf018546ea29bd3b40fa655473dc0c1e6" exitCode=0 Feb 26 08:51:58 crc kubenswrapper[4925]: I0226 08:51:58.327563 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbkc7" event={"ID":"1a918d8d-5591-4766-b734-1f8988eab653","Type":"ContainerDied","Data":"3039b7292a1a354f3e2a34d57b17e1cdf018546ea29bd3b40fa655473dc0c1e6"} Feb 26 08:51:58 crc kubenswrapper[4925]: I0226 08:51:58.327580 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbkc7" 
event={"ID":"1a918d8d-5591-4766-b734-1f8988eab653","Type":"ContainerStarted","Data":"3c2a794d61e9fa3aec4fe1b3641344a0b3b992a7454cc2a9599fd346701a5987"} Feb 26 08:51:58 crc kubenswrapper[4925]: I0226 08:51:58.331318 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzzc6" event={"ID":"8aafadb1-ebf7-4af2-9766-5a93d4bda27a","Type":"ContainerStarted","Data":"c075ee159ebb28dbb09e5f0129b5d1316561026c194db23b47d399965b6f2226"} Feb 26 08:51:58 crc kubenswrapper[4925]: I0226 08:51:58.367169 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vvxd6" podStartSLOduration=1.764829233 podStartE2EDuration="4.367150295s" podCreationTimestamp="2026-02-26 08:51:54 +0000 UTC" firstStartedPulling="2026-02-26 08:51:55.291103477 +0000 UTC m=+447.430677760" lastFinishedPulling="2026-02-26 08:51:57.893424539 +0000 UTC m=+450.032998822" observedRunningTime="2026-02-26 08:51:58.34982248 +0000 UTC m=+450.489396763" watchObservedRunningTime="2026-02-26 08:51:58.367150295 +0000 UTC m=+450.506724578" Feb 26 08:51:58 crc kubenswrapper[4925]: I0226 08:51:58.371700 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gzzc6" podStartSLOduration=2.80601325 podStartE2EDuration="5.371684961s" podCreationTimestamp="2026-02-26 08:51:53 +0000 UTC" firstStartedPulling="2026-02-26 08:51:55.296268029 +0000 UTC m=+447.435842312" lastFinishedPulling="2026-02-26 08:51:57.86193974 +0000 UTC m=+450.001514023" observedRunningTime="2026-02-26 08:51:58.364576909 +0000 UTC m=+450.504151192" watchObservedRunningTime="2026-02-26 08:51:58.371684961 +0000 UTC m=+450.511259244" Feb 26 08:51:59 crc kubenswrapper[4925]: I0226 08:51:59.339406 4925 generic.go:334] "Generic (PLEG): container finished" podID="8adfff67-e5bf-4e74-a479-1136b9ff85c1" containerID="c9dc53fe28b855ed7af7feb04f17264d3340fcf77793116bb8fa4557251815b3" exitCode=0 Feb 26 
08:51:59 crc kubenswrapper[4925]: I0226 08:51:59.339610 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7h5" event={"ID":"8adfff67-e5bf-4e74-a479-1136b9ff85c1","Type":"ContainerDied","Data":"c9dc53fe28b855ed7af7feb04f17264d3340fcf77793116bb8fa4557251815b3"} Feb 26 08:52:00 crc kubenswrapper[4925]: I0226 08:52:00.132105 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29534932-6n2fs"] Feb 26 08:52:00 crc kubenswrapper[4925]: I0226 08:52:00.133128 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534932-6n2fs" Feb 26 08:52:00 crc kubenswrapper[4925]: I0226 08:52:00.138489 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 08:52:00 crc kubenswrapper[4925]: I0226 08:52:00.138758 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 08:52:00 crc kubenswrapper[4925]: I0226 08:52:00.139056 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zdfj4" Feb 26 08:52:00 crc kubenswrapper[4925]: I0226 08:52:00.148310 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29534932-6n2fs"] Feb 26 08:52:00 crc kubenswrapper[4925]: I0226 08:52:00.245872 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp6v4\" (UniqueName: \"kubernetes.io/projected/781a9125-5d20-412d-bf5b-5112d497934e-kube-api-access-qp6v4\") pod \"auto-csr-approver-29534932-6n2fs\" (UID: \"781a9125-5d20-412d-bf5b-5112d497934e\") " pod="openshift-infra/auto-csr-approver-29534932-6n2fs" Feb 26 08:52:00 crc kubenswrapper[4925]: I0226 08:52:00.346691 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp6v4\" 
(UniqueName: \"kubernetes.io/projected/781a9125-5d20-412d-bf5b-5112d497934e-kube-api-access-qp6v4\") pod \"auto-csr-approver-29534932-6n2fs\" (UID: \"781a9125-5d20-412d-bf5b-5112d497934e\") " pod="openshift-infra/auto-csr-approver-29534932-6n2fs" Feb 26 08:52:00 crc kubenswrapper[4925]: I0226 08:52:00.346857 4925 generic.go:334] "Generic (PLEG): container finished" podID="1a918d8d-5591-4766-b734-1f8988eab653" containerID="739b29f898818fbdf3e37c7c599e25a1ced4ff317076880236439b906554aae6" exitCode=0 Feb 26 08:52:00 crc kubenswrapper[4925]: I0226 08:52:00.346884 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbkc7" event={"ID":"1a918d8d-5591-4766-b734-1f8988eab653","Type":"ContainerDied","Data":"739b29f898818fbdf3e37c7c599e25a1ced4ff317076880236439b906554aae6"} Feb 26 08:52:00 crc kubenswrapper[4925]: I0226 08:52:00.369100 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp6v4\" (UniqueName: \"kubernetes.io/projected/781a9125-5d20-412d-bf5b-5112d497934e-kube-api-access-qp6v4\") pod \"auto-csr-approver-29534932-6n2fs\" (UID: \"781a9125-5d20-412d-bf5b-5112d497934e\") " pod="openshift-infra/auto-csr-approver-29534932-6n2fs" Feb 26 08:52:00 crc kubenswrapper[4925]: I0226 08:52:00.464640 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29534932-6n2fs" Feb 26 08:52:00 crc kubenswrapper[4925]: I0226 08:52:00.654954 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29534932-6n2fs"] Feb 26 08:52:00 crc kubenswrapper[4925]: W0226 08:52:00.660122 4925 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod781a9125_5d20_412d_bf5b_5112d497934e.slice/crio-3ccc86ed881eac253d739dce33f8f50c304e634bc0d6564b28c3722406cc51c5 WatchSource:0}: Error finding container 3ccc86ed881eac253d739dce33f8f50c304e634bc0d6564b28c3722406cc51c5: Status 404 returned error can't find the container with id 3ccc86ed881eac253d739dce33f8f50c304e634bc0d6564b28c3722406cc51c5 Feb 26 08:52:01 crc kubenswrapper[4925]: I0226 08:52:01.353033 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534932-6n2fs" event={"ID":"781a9125-5d20-412d-bf5b-5112d497934e","Type":"ContainerStarted","Data":"3ccc86ed881eac253d739dce33f8f50c304e634bc0d6564b28c3722406cc51c5"} Feb 26 08:52:01 crc kubenswrapper[4925]: I0226 08:52:01.355033 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sb7h5" event={"ID":"8adfff67-e5bf-4e74-a479-1136b9ff85c1","Type":"ContainerStarted","Data":"d0bc9da4f3375cb863d0d8527d6b8283fd4fdaa3e495d0127a5d95c738291d56"} Feb 26 08:52:01 crc kubenswrapper[4925]: I0226 08:52:01.357011 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hbkc7" event={"ID":"1a918d8d-5591-4766-b734-1f8988eab653","Type":"ContainerStarted","Data":"29c86021c9f32b2a5cf537fcaad3e4d2267dc565d9ee9b654e4de1ba01933c41"} Feb 26 08:52:01 crc kubenswrapper[4925]: I0226 08:52:01.373992 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sb7h5" podStartSLOduration=2.466360585 
podStartE2EDuration="5.373971204s" podCreationTimestamp="2026-02-26 08:51:56 +0000 UTC" firstStartedPulling="2026-02-26 08:51:57.311294638 +0000 UTC m=+449.450868931" lastFinishedPulling="2026-02-26 08:52:00.218905227 +0000 UTC m=+452.358479550" observedRunningTime="2026-02-26 08:52:01.372987009 +0000 UTC m=+453.512561292" watchObservedRunningTime="2026-02-26 08:52:01.373971204 +0000 UTC m=+453.513545497" Feb 26 08:52:01 crc kubenswrapper[4925]: I0226 08:52:01.397960 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hbkc7" podStartSLOduration=2.89065077 podStartE2EDuration="5.39793999s" podCreationTimestamp="2026-02-26 08:51:56 +0000 UTC" firstStartedPulling="2026-02-26 08:51:58.32999615 +0000 UTC m=+450.469570433" lastFinishedPulling="2026-02-26 08:52:00.83728537 +0000 UTC m=+452.976859653" observedRunningTime="2026-02-26 08:52:01.395009585 +0000 UTC m=+453.534583868" watchObservedRunningTime="2026-02-26 08:52:01.39793999 +0000 UTC m=+453.537514273" Feb 26 08:52:02 crc kubenswrapper[4925]: I0226 08:52:02.363625 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534932-6n2fs" event={"ID":"781a9125-5d20-412d-bf5b-5112d497934e","Type":"ContainerStarted","Data":"c3c1e6e0485af09dbc600d275010c16a9c8db155df96de118eb71c1bb7070259"} Feb 26 08:52:02 crc kubenswrapper[4925]: I0226 08:52:02.378240 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29534932-6n2fs" podStartSLOduration=1.434721316 podStartE2EDuration="2.378222585s" podCreationTimestamp="2026-02-26 08:52:00 +0000 UTC" firstStartedPulling="2026-02-26 08:52:00.662246782 +0000 UTC m=+452.801821065" lastFinishedPulling="2026-02-26 08:52:01.605748051 +0000 UTC m=+453.745322334" observedRunningTime="2026-02-26 08:52:02.376743457 +0000 UTC m=+454.516317740" watchObservedRunningTime="2026-02-26 08:52:02.378222585 +0000 UTC m=+454.517796868" Feb 26 
08:52:03 crc kubenswrapper[4925]: I0226 08:52:03.369037 4925 generic.go:334] "Generic (PLEG): container finished" podID="781a9125-5d20-412d-bf5b-5112d497934e" containerID="c3c1e6e0485af09dbc600d275010c16a9c8db155df96de118eb71c1bb7070259" exitCode=0 Feb 26 08:52:03 crc kubenswrapper[4925]: I0226 08:52:03.369093 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534932-6n2fs" event={"ID":"781a9125-5d20-412d-bf5b-5112d497934e","Type":"ContainerDied","Data":"c3c1e6e0485af09dbc600d275010c16a9c8db155df96de118eb71c1bb7070259"} Feb 26 08:52:04 crc kubenswrapper[4925]: I0226 08:52:04.294083 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:52:04 crc kubenswrapper[4925]: I0226 08:52:04.294504 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:52:04 crc kubenswrapper[4925]: I0226 08:52:04.335236 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:52:04 crc kubenswrapper[4925]: I0226 08:52:04.410508 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gzzc6" Feb 26 08:52:04 crc kubenswrapper[4925]: I0226 08:52:04.588343 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29534932-6n2fs" Feb 26 08:52:04 crc kubenswrapper[4925]: I0226 08:52:04.701944 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp6v4\" (UniqueName: \"kubernetes.io/projected/781a9125-5d20-412d-bf5b-5112d497934e-kube-api-access-qp6v4\") pod \"781a9125-5d20-412d-bf5b-5112d497934e\" (UID: \"781a9125-5d20-412d-bf5b-5112d497934e\") " Feb 26 08:52:04 crc kubenswrapper[4925]: I0226 08:52:04.710756 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/781a9125-5d20-412d-bf5b-5112d497934e-kube-api-access-qp6v4" (OuterVolumeSpecName: "kube-api-access-qp6v4") pod "781a9125-5d20-412d-bf5b-5112d497934e" (UID: "781a9125-5d20-412d-bf5b-5112d497934e"). InnerVolumeSpecName "kube-api-access-qp6v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:52:04 crc kubenswrapper[4925]: I0226 08:52:04.803243 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp6v4\" (UniqueName: \"kubernetes.io/projected/781a9125-5d20-412d-bf5b-5112d497934e-kube-api-access-qp6v4\") on node \"crc\" DevicePath \"\"" Feb 26 08:52:04 crc kubenswrapper[4925]: I0226 08:52:04.894090 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:52:04 crc kubenswrapper[4925]: I0226 08:52:04.894163 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:52:04 crc kubenswrapper[4925]: I0226 08:52:04.934604 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:52:05 crc kubenswrapper[4925]: I0226 08:52:05.382677 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534932-6n2fs" 
event={"ID":"781a9125-5d20-412d-bf5b-5112d497934e","Type":"ContainerDied","Data":"3ccc86ed881eac253d739dce33f8f50c304e634bc0d6564b28c3722406cc51c5"} Feb 26 08:52:05 crc kubenswrapper[4925]: I0226 08:52:05.382996 4925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ccc86ed881eac253d739dce33f8f50c304e634bc0d6564b28c3722406cc51c5" Feb 26 08:52:05 crc kubenswrapper[4925]: I0226 08:52:05.382813 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534932-6n2fs" Feb 26 08:52:05 crc kubenswrapper[4925]: I0226 08:52:05.423403 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29534926-k666h"] Feb 26 08:52:05 crc kubenswrapper[4925]: I0226 08:52:05.427598 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29534926-k666h"] Feb 26 08:52:05 crc kubenswrapper[4925]: I0226 08:52:05.427745 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vvxd6" Feb 26 08:52:06 crc kubenswrapper[4925]: I0226 08:52:06.512902 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e938a2d-afc6-4360-aa50-e93c71d2c8c3" path="/var/lib/kubelet/pods/0e938a2d-afc6-4360-aa50-e93c71d2c8c3/volumes" Feb 26 08:52:06 crc kubenswrapper[4925]: I0226 08:52:06.687915 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:52:06 crc kubenswrapper[4925]: I0226 08:52:06.687974 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:52:06 crc kubenswrapper[4925]: I0226 08:52:06.726751 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:52:07 crc kubenswrapper[4925]: I0226 08:52:07.288130 4925 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:52:07 crc kubenswrapper[4925]: I0226 08:52:07.288399 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:52:07 crc kubenswrapper[4925]: I0226 08:52:07.329885 4925 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:52:07 crc kubenswrapper[4925]: I0226 08:52:07.431909 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sb7h5" Feb 26 08:52:07 crc kubenswrapper[4925]: I0226 08:52:07.433398 4925 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hbkc7" Feb 26 08:52:15 crc kubenswrapper[4925]: I0226 08:52:15.871834 4925 patch_prober.go:28] interesting pod/machine-config-daemon-s6pcl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 08:52:15 crc kubenswrapper[4925]: I0226 08:52:15.872661 4925 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" podUID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 08:52:45 crc kubenswrapper[4925]: I0226 08:52:45.871809 4925 patch_prober.go:28] interesting pod/machine-config-daemon-s6pcl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 
08:52:45 crc kubenswrapper[4925]: I0226 08:52:45.872352 4925 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" podUID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 08:52:45 crc kubenswrapper[4925]: I0226 08:52:45.872415 4925 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:52:45 crc kubenswrapper[4925]: I0226 08:52:45.873190 4925 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b9902dc2e31bf3f83affdad185ade7bc019efcdf1759a693c00f03b44bf4d0a"} pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 08:52:45 crc kubenswrapper[4925]: I0226 08:52:45.873285 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" podUID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerName="machine-config-daemon" containerID="cri-o://0b9902dc2e31bf3f83affdad185ade7bc019efcdf1759a693c00f03b44bf4d0a" gracePeriod=600 Feb 26 08:52:46 crc kubenswrapper[4925]: I0226 08:52:46.046354 4925 generic.go:334] "Generic (PLEG): container finished" podID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerID="0b9902dc2e31bf3f83affdad185ade7bc019efcdf1759a693c00f03b44bf4d0a" exitCode=0 Feb 26 08:52:46 crc kubenswrapper[4925]: I0226 08:52:46.046395 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" 
event={"ID":"52bdf0f6-9afd-42fe-9a30-7da82bc7f619","Type":"ContainerDied","Data":"0b9902dc2e31bf3f83affdad185ade7bc019efcdf1759a693c00f03b44bf4d0a"} Feb 26 08:52:46 crc kubenswrapper[4925]: I0226 08:52:46.046449 4925 scope.go:117] "RemoveContainer" containerID="767cde5977703f5ed565c08a4502ebf2aabd1863ff5e7f29c6d18e2e04550a1c" Feb 26 08:52:47 crc kubenswrapper[4925]: I0226 08:52:47.055303 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" event={"ID":"52bdf0f6-9afd-42fe-9a30-7da82bc7f619","Type":"ContainerStarted","Data":"e6128aecea95bb948b242d988728af1e33ed31b904f9a249c13c82124a61d7d2"} Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.137365 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29534934-2nw7z"] Feb 26 08:54:00 crc kubenswrapper[4925]: E0226 08:54:00.139866 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781a9125-5d20-412d-bf5b-5112d497934e" containerName="oc" Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.139892 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="781a9125-5d20-412d-bf5b-5112d497934e" containerName="oc" Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.140079 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="781a9125-5d20-412d-bf5b-5112d497934e" containerName="oc" Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.140679 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29534934-2nw7z" Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.144861 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29534934-2nw7z"] Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.185732 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.185790 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.185981 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zdfj4" Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.303772 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl57g\" (UniqueName: \"kubernetes.io/projected/4cfa6b4d-8f11-4a07-820d-3aadcec92425-kube-api-access-bl57g\") pod \"auto-csr-approver-29534934-2nw7z\" (UID: \"4cfa6b4d-8f11-4a07-820d-3aadcec92425\") " pod="openshift-infra/auto-csr-approver-29534934-2nw7z" Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.405786 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl57g\" (UniqueName: \"kubernetes.io/projected/4cfa6b4d-8f11-4a07-820d-3aadcec92425-kube-api-access-bl57g\") pod \"auto-csr-approver-29534934-2nw7z\" (UID: \"4cfa6b4d-8f11-4a07-820d-3aadcec92425\") " pod="openshift-infra/auto-csr-approver-29534934-2nw7z" Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.435301 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl57g\" (UniqueName: \"kubernetes.io/projected/4cfa6b4d-8f11-4a07-820d-3aadcec92425-kube-api-access-bl57g\") pod \"auto-csr-approver-29534934-2nw7z\" (UID: \"4cfa6b4d-8f11-4a07-820d-3aadcec92425\") " 
pod="openshift-infra/auto-csr-approver-29534934-2nw7z" Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.506012 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534934-2nw7z" Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.954667 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29534934-2nw7z"] Feb 26 08:54:00 crc kubenswrapper[4925]: I0226 08:54:00.968701 4925 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 08:54:01 crc kubenswrapper[4925]: I0226 08:54:01.507291 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534934-2nw7z" event={"ID":"4cfa6b4d-8f11-4a07-820d-3aadcec92425","Type":"ContainerStarted","Data":"9ffe97b73c2f5b74361308ff10ad1f8422fe724f997a8717de2f835193597298"} Feb 26 08:54:02 crc kubenswrapper[4925]: I0226 08:54:02.514615 4925 generic.go:334] "Generic (PLEG): container finished" podID="4cfa6b4d-8f11-4a07-820d-3aadcec92425" containerID="4f2a39ed5d8af8197a54f0e6952758d78a80a5f9e5e5f79f724a8f58db986877" exitCode=0 Feb 26 08:54:02 crc kubenswrapper[4925]: I0226 08:54:02.514690 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534934-2nw7z" event={"ID":"4cfa6b4d-8f11-4a07-820d-3aadcec92425","Type":"ContainerDied","Data":"4f2a39ed5d8af8197a54f0e6952758d78a80a5f9e5e5f79f724a8f58db986877"} Feb 26 08:54:03 crc kubenswrapper[4925]: I0226 08:54:03.834385 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29534934-2nw7z" Feb 26 08:54:03 crc kubenswrapper[4925]: I0226 08:54:03.949434 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl57g\" (UniqueName: \"kubernetes.io/projected/4cfa6b4d-8f11-4a07-820d-3aadcec92425-kube-api-access-bl57g\") pod \"4cfa6b4d-8f11-4a07-820d-3aadcec92425\" (UID: \"4cfa6b4d-8f11-4a07-820d-3aadcec92425\") " Feb 26 08:54:03 crc kubenswrapper[4925]: I0226 08:54:03.955697 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cfa6b4d-8f11-4a07-820d-3aadcec92425-kube-api-access-bl57g" (OuterVolumeSpecName: "kube-api-access-bl57g") pod "4cfa6b4d-8f11-4a07-820d-3aadcec92425" (UID: "4cfa6b4d-8f11-4a07-820d-3aadcec92425"). InnerVolumeSpecName "kube-api-access-bl57g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:54:04 crc kubenswrapper[4925]: I0226 08:54:04.052298 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl57g\" (UniqueName: \"kubernetes.io/projected/4cfa6b4d-8f11-4a07-820d-3aadcec92425-kube-api-access-bl57g\") on node \"crc\" DevicePath \"\"" Feb 26 08:54:04 crc kubenswrapper[4925]: I0226 08:54:04.528942 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534934-2nw7z" event={"ID":"4cfa6b4d-8f11-4a07-820d-3aadcec92425","Type":"ContainerDied","Data":"9ffe97b73c2f5b74361308ff10ad1f8422fe724f997a8717de2f835193597298"} Feb 26 08:54:04 crc kubenswrapper[4925]: I0226 08:54:04.528999 4925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ffe97b73c2f5b74361308ff10ad1f8422fe724f997a8717de2f835193597298" Feb 26 08:54:04 crc kubenswrapper[4925]: I0226 08:54:04.528967 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29534934-2nw7z" Feb 26 08:54:04 crc kubenswrapper[4925]: I0226 08:54:04.897876 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29534928-pxjft"] Feb 26 08:54:04 crc kubenswrapper[4925]: I0226 08:54:04.901086 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29534928-pxjft"] Feb 26 08:54:06 crc kubenswrapper[4925]: I0226 08:54:06.515949 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2a81774-e811-40f9-a623-c50a6bcd5d7b" path="/var/lib/kubelet/pods/b2a81774-e811-40f9-a623-c50a6bcd5d7b/volumes" Feb 26 08:55:15 crc kubenswrapper[4925]: I0226 08:55:15.872851 4925 patch_prober.go:28] interesting pod/machine-config-daemon-s6pcl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 08:55:15 crc kubenswrapper[4925]: I0226 08:55:15.873724 4925 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" podUID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 08:55:45 crc kubenswrapper[4925]: I0226 08:55:45.872676 4925 patch_prober.go:28] interesting pod/machine-config-daemon-s6pcl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 08:55:45 crc kubenswrapper[4925]: I0226 08:55:45.873227 4925 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" 
podUID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 08:55:49 crc kubenswrapper[4925]: I0226 08:55:49.026745 4925 scope.go:117] "RemoveContainer" containerID="f673412f2df1db5308a26c13aaeb1004d0994fcbfd0f318b95e23c3cb6a2a3d6" Feb 26 08:55:49 crc kubenswrapper[4925]: I0226 08:55:49.072349 4925 scope.go:117] "RemoveContainer" containerID="b9db05ef4ecc6cb32dd117435afee55e2a6325ea6e1b0cb07e94963b825794ee" Feb 26 08:55:49 crc kubenswrapper[4925]: I0226 08:55:49.122705 4925 scope.go:117] "RemoveContainer" containerID="c3e875887d3713fbbe41e0d57043fbc94df80d6590717f0b11fd3477674162c8" Feb 26 08:56:00 crc kubenswrapper[4925]: I0226 08:56:00.148060 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29534936-hwzdw"] Feb 26 08:56:00 crc kubenswrapper[4925]: E0226 08:56:00.148720 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfa6b4d-8f11-4a07-820d-3aadcec92425" containerName="oc" Feb 26 08:56:00 crc kubenswrapper[4925]: I0226 08:56:00.148742 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfa6b4d-8f11-4a07-820d-3aadcec92425" containerName="oc" Feb 26 08:56:00 crc kubenswrapper[4925]: I0226 08:56:00.148912 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cfa6b4d-8f11-4a07-820d-3aadcec92425" containerName="oc" Feb 26 08:56:00 crc kubenswrapper[4925]: I0226 08:56:00.149446 4925 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29534936-hwzdw" Feb 26 08:56:00 crc kubenswrapper[4925]: I0226 08:56:00.153301 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 08:56:00 crc kubenswrapper[4925]: I0226 08:56:00.153400 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zdfj4" Feb 26 08:56:00 crc kubenswrapper[4925]: I0226 08:56:00.153985 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 08:56:00 crc kubenswrapper[4925]: I0226 08:56:00.162577 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29534936-hwzdw"] Feb 26 08:56:00 crc kubenswrapper[4925]: I0226 08:56:00.189449 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9jnn\" (UniqueName: \"kubernetes.io/projected/45347466-0f81-40c8-adc3-6f52d3c85ed5-kube-api-access-p9jnn\") pod \"auto-csr-approver-29534936-hwzdw\" (UID: \"45347466-0f81-40c8-adc3-6f52d3c85ed5\") " pod="openshift-infra/auto-csr-approver-29534936-hwzdw" Feb 26 08:56:00 crc kubenswrapper[4925]: I0226 08:56:00.290649 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9jnn\" (UniqueName: \"kubernetes.io/projected/45347466-0f81-40c8-adc3-6f52d3c85ed5-kube-api-access-p9jnn\") pod \"auto-csr-approver-29534936-hwzdw\" (UID: \"45347466-0f81-40c8-adc3-6f52d3c85ed5\") " pod="openshift-infra/auto-csr-approver-29534936-hwzdw" Feb 26 08:56:00 crc kubenswrapper[4925]: I0226 08:56:00.310181 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9jnn\" (UniqueName: \"kubernetes.io/projected/45347466-0f81-40c8-adc3-6f52d3c85ed5-kube-api-access-p9jnn\") pod \"auto-csr-approver-29534936-hwzdw\" (UID: \"45347466-0f81-40c8-adc3-6f52d3c85ed5\") " 
pod="openshift-infra/auto-csr-approver-29534936-hwzdw" Feb 26 08:56:00 crc kubenswrapper[4925]: I0226 08:56:00.481766 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534936-hwzdw" Feb 26 08:56:00 crc kubenswrapper[4925]: I0226 08:56:00.724224 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29534936-hwzdw"] Feb 26 08:56:01 crc kubenswrapper[4925]: I0226 08:56:01.281012 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534936-hwzdw" event={"ID":"45347466-0f81-40c8-adc3-6f52d3c85ed5","Type":"ContainerStarted","Data":"9ed96e7cb47e2f0899131555210f62258fa353b1b6cb6bac2566d7df4016692d"} Feb 26 08:56:02 crc kubenswrapper[4925]: I0226 08:56:02.286986 4925 generic.go:334] "Generic (PLEG): container finished" podID="45347466-0f81-40c8-adc3-6f52d3c85ed5" containerID="3d7b7b7433118279d80ede1ad844669caf08ee42c9faf1bd95143b8e4bf2e179" exitCode=0 Feb 26 08:56:02 crc kubenswrapper[4925]: I0226 08:56:02.287070 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534936-hwzdw" event={"ID":"45347466-0f81-40c8-adc3-6f52d3c85ed5","Type":"ContainerDied","Data":"3d7b7b7433118279d80ede1ad844669caf08ee42c9faf1bd95143b8e4bf2e179"} Feb 26 08:56:03 crc kubenswrapper[4925]: I0226 08:56:03.555045 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29534936-hwzdw" Feb 26 08:56:03 crc kubenswrapper[4925]: I0226 08:56:03.630262 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9jnn\" (UniqueName: \"kubernetes.io/projected/45347466-0f81-40c8-adc3-6f52d3c85ed5-kube-api-access-p9jnn\") pod \"45347466-0f81-40c8-adc3-6f52d3c85ed5\" (UID: \"45347466-0f81-40c8-adc3-6f52d3c85ed5\") " Feb 26 08:56:03 crc kubenswrapper[4925]: I0226 08:56:03.635894 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45347466-0f81-40c8-adc3-6f52d3c85ed5-kube-api-access-p9jnn" (OuterVolumeSpecName: "kube-api-access-p9jnn") pod "45347466-0f81-40c8-adc3-6f52d3c85ed5" (UID: "45347466-0f81-40c8-adc3-6f52d3c85ed5"). InnerVolumeSpecName "kube-api-access-p9jnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:56:03 crc kubenswrapper[4925]: I0226 08:56:03.731426 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9jnn\" (UniqueName: \"kubernetes.io/projected/45347466-0f81-40c8-adc3-6f52d3c85ed5-kube-api-access-p9jnn\") on node \"crc\" DevicePath \"\"" Feb 26 08:56:04 crc kubenswrapper[4925]: I0226 08:56:04.303010 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534936-hwzdw" event={"ID":"45347466-0f81-40c8-adc3-6f52d3c85ed5","Type":"ContainerDied","Data":"9ed96e7cb47e2f0899131555210f62258fa353b1b6cb6bac2566d7df4016692d"} Feb 26 08:56:04 crc kubenswrapper[4925]: I0226 08:56:04.303069 4925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ed96e7cb47e2f0899131555210f62258fa353b1b6cb6bac2566d7df4016692d" Feb 26 08:56:04 crc kubenswrapper[4925]: I0226 08:56:04.303072 4925 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29534936-hwzdw" Feb 26 08:56:04 crc kubenswrapper[4925]: I0226 08:56:04.617619 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29534930-h9hwr"] Feb 26 08:56:04 crc kubenswrapper[4925]: I0226 08:56:04.621858 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29534930-h9hwr"] Feb 26 08:56:06 crc kubenswrapper[4925]: I0226 08:56:06.520884 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed05424-f6d1-49dc-a527-8a32a355c6e1" path="/var/lib/kubelet/pods/2ed05424-f6d1-49dc-a527-8a32a355c6e1/volumes" Feb 26 08:56:15 crc kubenswrapper[4925]: I0226 08:56:15.872192 4925 patch_prober.go:28] interesting pod/machine-config-daemon-s6pcl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 08:56:15 crc kubenswrapper[4925]: I0226 08:56:15.872725 4925 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" podUID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 08:56:15 crc kubenswrapper[4925]: I0226 08:56:15.872796 4925 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" Feb 26 08:56:15 crc kubenswrapper[4925]: I0226 08:56:15.873708 4925 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e6128aecea95bb948b242d988728af1e33ed31b904f9a249c13c82124a61d7d2"} pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 08:56:15 crc kubenswrapper[4925]: I0226 08:56:15.873838 4925 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" podUID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerName="machine-config-daemon" containerID="cri-o://e6128aecea95bb948b242d988728af1e33ed31b904f9a249c13c82124a61d7d2" gracePeriod=600 Feb 26 08:56:16 crc kubenswrapper[4925]: I0226 08:56:16.401938 4925 generic.go:334] "Generic (PLEG): container finished" podID="52bdf0f6-9afd-42fe-9a30-7da82bc7f619" containerID="e6128aecea95bb948b242d988728af1e33ed31b904f9a249c13c82124a61d7d2" exitCode=0 Feb 26 08:56:16 crc kubenswrapper[4925]: I0226 08:56:16.402014 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" event={"ID":"52bdf0f6-9afd-42fe-9a30-7da82bc7f619","Type":"ContainerDied","Data":"e6128aecea95bb948b242d988728af1e33ed31b904f9a249c13c82124a61d7d2"} Feb 26 08:56:16 crc kubenswrapper[4925]: I0226 08:56:16.402949 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s6pcl" event={"ID":"52bdf0f6-9afd-42fe-9a30-7da82bc7f619","Type":"ContainerStarted","Data":"8fe5f15252e3ffc288ef5799d29d878f6a6807dd01064ea6ad578d3a68d9debe"} Feb 26 08:56:16 crc kubenswrapper[4925]: I0226 08:56:16.403001 4925 scope.go:117] "RemoveContainer" containerID="0b9902dc2e31bf3f83affdad185ade7bc019efcdf1759a693c00f03b44bf4d0a" Feb 26 08:56:49 crc kubenswrapper[4925]: I0226 08:56:49.195828 4925 scope.go:117] "RemoveContainer" containerID="5edaf7089152f329b3773f419a7f9d8cd6e66429bd45532fa5478b27829ec4d0" Feb 26 08:58:00 crc kubenswrapper[4925]: I0226 08:58:00.163069 4925 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29534938-2gwbx"] Feb 26 08:58:00 crc kubenswrapper[4925]: E0226 
08:58:00.164497 4925 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45347466-0f81-40c8-adc3-6f52d3c85ed5" containerName="oc" Feb 26 08:58:00 crc kubenswrapper[4925]: I0226 08:58:00.164516 4925 state_mem.go:107] "Deleted CPUSet assignment" podUID="45347466-0f81-40c8-adc3-6f52d3c85ed5" containerName="oc" Feb 26 08:58:00 crc kubenswrapper[4925]: I0226 08:58:00.164644 4925 memory_manager.go:354] "RemoveStaleState removing state" podUID="45347466-0f81-40c8-adc3-6f52d3c85ed5" containerName="oc" Feb 26 08:58:00 crc kubenswrapper[4925]: I0226 08:58:00.165188 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534938-2gwbx" Feb 26 08:58:00 crc kubenswrapper[4925]: I0226 08:58:00.168416 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 08:58:00 crc kubenswrapper[4925]: I0226 08:58:00.168630 4925 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 08:58:00 crc kubenswrapper[4925]: I0226 08:58:00.168770 4925 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zdfj4" Feb 26 08:58:00 crc kubenswrapper[4925]: I0226 08:58:00.199883 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29534938-2gwbx"] Feb 26 08:58:00 crc kubenswrapper[4925]: I0226 08:58:00.320256 4925 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk62k\" (UniqueName: \"kubernetes.io/projected/5bb51f32-36c8-4181-a6bc-cadaf2c82f21-kube-api-access-hk62k\") pod \"auto-csr-approver-29534938-2gwbx\" (UID: \"5bb51f32-36c8-4181-a6bc-cadaf2c82f21\") " pod="openshift-infra/auto-csr-approver-29534938-2gwbx" Feb 26 08:58:00 crc kubenswrapper[4925]: I0226 08:58:00.422436 4925 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hk62k\" (UniqueName: \"kubernetes.io/projected/5bb51f32-36c8-4181-a6bc-cadaf2c82f21-kube-api-access-hk62k\") pod \"auto-csr-approver-29534938-2gwbx\" (UID: \"5bb51f32-36c8-4181-a6bc-cadaf2c82f21\") " pod="openshift-infra/auto-csr-approver-29534938-2gwbx" Feb 26 08:58:00 crc kubenswrapper[4925]: I0226 08:58:00.450303 4925 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk62k\" (UniqueName: \"kubernetes.io/projected/5bb51f32-36c8-4181-a6bc-cadaf2c82f21-kube-api-access-hk62k\") pod \"auto-csr-approver-29534938-2gwbx\" (UID: \"5bb51f32-36c8-4181-a6bc-cadaf2c82f21\") " pod="openshift-infra/auto-csr-approver-29534938-2gwbx" Feb 26 08:58:00 crc kubenswrapper[4925]: I0226 08:58:00.503058 4925 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534938-2gwbx" Feb 26 08:58:00 crc kubenswrapper[4925]: I0226 08:58:00.691628 4925 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29534938-2gwbx"] Feb 26 08:58:01 crc kubenswrapper[4925]: I0226 08:58:01.176700 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534938-2gwbx" event={"ID":"5bb51f32-36c8-4181-a6bc-cadaf2c82f21","Type":"ContainerStarted","Data":"808260a4500b926175cb131a387801f75620e5fca307312c3917236f74f0def6"} Feb 26 08:58:02 crc kubenswrapper[4925]: I0226 08:58:02.190494 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534938-2gwbx" event={"ID":"5bb51f32-36c8-4181-a6bc-cadaf2c82f21","Type":"ContainerStarted","Data":"967c2317a12701c7e960a20b08813d8de27f750a2ada68ed310da4c8f3287897"} Feb 26 08:58:02 crc kubenswrapper[4925]: I0226 08:58:02.208521 4925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29534938-2gwbx" podStartSLOduration=1.156805497 podStartE2EDuration="2.208496804s" podCreationTimestamp="2026-02-26 08:58:00 
+0000 UTC" firstStartedPulling="2026-02-26 08:58:00.699191018 +0000 UTC m=+812.838765311" lastFinishedPulling="2026-02-26 08:58:01.750882325 +0000 UTC m=+813.890456618" observedRunningTime="2026-02-26 08:58:02.203774465 +0000 UTC m=+814.343348768" watchObservedRunningTime="2026-02-26 08:58:02.208496804 +0000 UTC m=+814.348071107" Feb 26 08:58:03 crc kubenswrapper[4925]: I0226 08:58:03.198685 4925 generic.go:334] "Generic (PLEG): container finished" podID="5bb51f32-36c8-4181-a6bc-cadaf2c82f21" containerID="967c2317a12701c7e960a20b08813d8de27f750a2ada68ed310da4c8f3287897" exitCode=0 Feb 26 08:58:03 crc kubenswrapper[4925]: I0226 08:58:03.198728 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534938-2gwbx" event={"ID":"5bb51f32-36c8-4181-a6bc-cadaf2c82f21","Type":"ContainerDied","Data":"967c2317a12701c7e960a20b08813d8de27f750a2ada68ed310da4c8f3287897"} Feb 26 08:58:04 crc kubenswrapper[4925]: I0226 08:58:04.473679 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534938-2gwbx" Feb 26 08:58:04 crc kubenswrapper[4925]: I0226 08:58:04.583276 4925 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk62k\" (UniqueName: \"kubernetes.io/projected/5bb51f32-36c8-4181-a6bc-cadaf2c82f21-kube-api-access-hk62k\") pod \"5bb51f32-36c8-4181-a6bc-cadaf2c82f21\" (UID: \"5bb51f32-36c8-4181-a6bc-cadaf2c82f21\") " Feb 26 08:58:04 crc kubenswrapper[4925]: I0226 08:58:04.590997 4925 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bb51f32-36c8-4181-a6bc-cadaf2c82f21-kube-api-access-hk62k" (OuterVolumeSpecName: "kube-api-access-hk62k") pod "5bb51f32-36c8-4181-a6bc-cadaf2c82f21" (UID: "5bb51f32-36c8-4181-a6bc-cadaf2c82f21"). InnerVolumeSpecName "kube-api-access-hk62k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 08:58:04 crc kubenswrapper[4925]: I0226 08:58:04.685753 4925 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk62k\" (UniqueName: \"kubernetes.io/projected/5bb51f32-36c8-4181-a6bc-cadaf2c82f21-kube-api-access-hk62k\") on node \"crc\" DevicePath \"\"" Feb 26 08:58:05 crc kubenswrapper[4925]: I0226 08:58:05.217857 4925 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29534938-2gwbx" event={"ID":"5bb51f32-36c8-4181-a6bc-cadaf2c82f21","Type":"ContainerDied","Data":"808260a4500b926175cb131a387801f75620e5fca307312c3917236f74f0def6"} Feb 26 08:58:05 crc kubenswrapper[4925]: I0226 08:58:05.217923 4925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="808260a4500b926175cb131a387801f75620e5fca307312c3917236f74f0def6" Feb 26 08:58:05 crc kubenswrapper[4925]: I0226 08:58:05.217945 4925 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29534938-2gwbx" Feb 26 08:58:05 crc kubenswrapper[4925]: I0226 08:58:05.275869 4925 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29534932-6n2fs"] Feb 26 08:58:05 crc kubenswrapper[4925]: I0226 08:58:05.283546 4925 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29534932-6n2fs"] Feb 26 08:58:06 crc kubenswrapper[4925]: I0226 08:58:06.517329 4925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="781a9125-5d20-412d-bf5b-5112d497934e" path="/var/lib/kubelet/pods/781a9125-5d20-412d-bf5b-5112d497934e/volumes" Feb 26 08:58:09 crc kubenswrapper[4925]: I0226 08:58:09.246846 4925 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515150005506024442 0ustar 